12.0 - 18.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description: Observability Engineer
Experience: 12 to 18 years

We are looking for an experienced Observability Engineer to architect, implement, and maintain enterprise-grade monitoring solutions. This role demands deep expertise across observability tools, infrastructure monitoring, log analytics, and cloud-native environments.

Key Responsibilities
- Architect and manage observability frameworks using LogicMonitor, ServiceNow, BigPanda, and NiFi
- Implement log analytics and security monitoring with Azure Log Analytics and Azure Sentinel
- Build real-time dashboards using KQL, Splunk, and the Grafana suite (Alloy, Beyla, k6, Loki, Thanos, Tempo)
- Lead the infrastructure observability strategy for AKS (Azure Kubernetes Service)
- Automate observability workflows with PowerShell, GitHub, and API Management
- Collaborate with DevOps, cloud, and platform teams to ensure end-to-end system visibility and performance

Core Skills Required
- Monitoring & Alerting: LogicMonitor, BigPanda, ServiceNow, NiFi
- Log Analytics & SIEM: Azure Log Analytics, Azure Sentinel, KQL
- Dashboards & Visualization: Grafana suite, Splunk
- Cloud & Containers: AKS, data pipelines
- Automation & DevOps: GitHub, PowerShell, API Management

Preferred Skills
- Working knowledge of Cribl for log routing and filtering
- Familiarity with distributed tracing, advanced metrics, and telemetry strategies

Soft Skills & Expectations
- Strong cross-functional collaboration and stakeholder communication
- Ability to drive initiatives independently and mentor junior engineers
- Eagerness to explore and adopt emerging observability tools and trends

Skills: BigPanda, Azure Automation, Azure
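For context on the KQL dashboarding work mentioned above, here is a minimal sketch of running a KQL query against an Azure Log Analytics workspace from Python using the azure-monitor-query SDK. The workspace ID, table, and columns are placeholders, and the query itself is a hypothetical example rather than anything from this posting.

```python
# Minimal sketch: run a KQL query against Azure Log Analytics from Python.
# Assumes the azure-monitor-query and azure-identity packages are installed;
# the workspace ID and the AppRequests table/columns are illustrative only.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

client = LogsQueryClient(DefaultAzureCredential())

# Hypothetical KQL: request count and p95 latency per app over the last hour.
query = """
AppRequests
| summarize requests = count(), p95_ms = percentile(DurationMs, 95) by AppRoleName
| order by requests desc
"""

response = client.query_workspace(
    workspace_id="<your-workspace-id>",  # placeholder
    query=query,
    timespan=timedelta(hours=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))
```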
Posted 2 months ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Come work at a place where innovation and teamwork come together to support the most exciting missions in the world!

Director of Engineering, Connectors and Partner Integrations

About The Role
We are seeking an experienced and strategic Director of Engineering for Connectors and Platform Integrations to lead and scale our efforts in building high-impact integrations across cloud platforms, third-party applications (on-premises and cloud), security tools, and partner ecosystems. This role is crucial in enhancing the interoperability of the Qualys Enterprise TruRisk Platform with the broader security and IT operations ecosystem. You will lead multiple engineering teams focused on developing scalable connectors, APIs, SDKs, and integration solutions that empower our customers to extract maximum value from the Qualys Enterprise TruRisk Platform. The successful candidate will have a proven track record of building and managing high-impact connectors in cybersecurity.

Key Responsibilities
- Lead engineering efforts for developing and maintaining connectors and integrations with third-party platforms, including cloud providers (AWS, Azure, GCP), security tools, ITSM systems, and other enterprise applications.
- Build and foster strong technical partnerships with vendors, technology partners, and integration collaborators to expand the Qualys Enterprise TruRisk Platform ecosystem.
- Collaborate with cross-functional engineering teams, Product Management, Solution Architects, and Sales Engineering to define integration strategies and prioritize development based on customer needs and strategic initiatives.
- Oversee the architecture and delivery of integration components to ensure they meet performance, scalability, and security requirements.
- Manage, mentor, and scale high-performing engineering teams, focusing on execution, innovation, and excellence.
- Own the roadmap and execution for integration-related initiatives, ensuring on-time delivery and alignment with business goals.
- Act as a senior technical leader, driving engineering best practices and fostering a culture of continuous improvement and collaboration.
- Represent Qualys in partner-facing engagements, technical workshops, and integration strategy meetings.

Qualifications
- 12+ years of experience in software engineering, with at least 5 years in a senior leadership role.
- Proven track record of building and delivering enterprise-scale platform integrations and connectors for technologies such as SIEM, SOAR, CMDB, ticketing systems, and ITSM.
- Strong experience working with cloud providers (AWS, Azure, GCP), RESTful APIs, webhooks, message brokers, and modern integration frameworks (Apache Camel, Apache NiFi).
- Knowledge of API gateways, authentication protocols (OAuth, SAML), and integration security best practices.
- Familiarity with data normalization, transformation, and sync mechanisms in connector development.
- Deep understanding of partner ecosystem management, including collaboration, co-development, and joint delivery.
- Exposure to working with partner certification programs.
- Excellent stakeholder management and communication skills; able to bridge technical and business conversations across internal and external teams.
- Demonstrated ability to lead cross-functional initiatives and manage engineering teams in a distributed and agile environment.
- Expertise with programming languages such as Java.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
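As a concrete illustration of the connector work described above, here is a minimal, hypothetical sketch of a third-party integration: obtaining an OAuth 2.0 client-credentials token and polling a vendor REST API with paging. All URLs, field names, and the pagination contract are invented placeholders, not any real Qualys or partner API.

```python
# Hypothetical connector sketch: OAuth 2.0 client-credentials flow plus a
# paged REST pull. Endpoints, parameters, and response fields are invented.
import requests

TOKEN_URL = "https://vendor.example.com/oauth/token"    # placeholder
API_URL = "https://vendor.example.com/api/v1/findings"  # placeholder

def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials",
              "client_id": client_id,
              "client_secret": client_secret},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_findings(token: str):
    """Yield records page by page until the API reports no next page."""
    headers = {"Authorization": f"Bearer {token}"}
    page = 1
    while True:
        resp = requests.get(API_URL, headers=headers,
                            params={"page": page, "per_page": 100}, timeout=30)
        resp.raise_for_status()
        body = resp.json()
        yield from body.get("items", [])
        if not body.get("has_next"):  # assumed pagination contract
            break
        page += 1
```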
Posted 2 months ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
What is your role?
You will work in a multi-functional role combining expertise in system and Hadoop administration. You will work in a team that often interacts with customers on various aspects of technical support for the deployed system. You will be deputed at customer premises to assist customers with issues related to system and Hadoop administration, and will interact with the QA and Engineering teams to coordinate issue resolution within the SLA promised to the customer.

What will you do?
- Deploy and administer Hortonworks, Cloudera, and Apache Hadoop/Spark ecosystems.
- Install the Linux operating system and configure networking.
- Write Unix shell/Ansible scripts for automation.
- Maintain core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, and HBase.
- Take care of the day-to-day running of Hadoop clusters using Ambari, Cloudera Manager, or other monitoring tools, ensuring that the Hadoop cluster is up and running at all times.
- Maintain HBase clusters and handle capacity planning.
- Maintain the Solr cluster and handle capacity planning.
- Work closely with the database, network, and application teams to make sure that all big data applications are highly available and performing as expected.
- Manage a KVM virtualization environment.

What skills should you have?
Technical domain: Linux administration, Hadoop infrastructure and administration, Solr, configuration management (Ansible etc.).
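As a small illustration of the day-to-day cluster care listed above, here is a minimal sketch that shells out to the standard hdfs CLI to check filesystem health and capacity. It assumes the hdfs client is on PATH and that the operator account is permitted to run fsck and dfsadmin.

```python
# Minimal sketch: check HDFS health from Python by wrapping the hdfs CLI.
# Assumes the hdfs binary is on PATH and the caller may run dfsadmin/fsck.
import subprocess

def run(cmd: list[str]) -> str:
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# Overall filesystem health (report contains "HEALTHY" when all is well).
fsck_report = run(["hdfs", "fsck", "/"])
print("filesystem healthy:", "HEALTHY" in fsck_report)

# Capacity and live/dead datanode summary from the namenode.
report = run(["hdfs", "dfsadmin", "-report"])
for line in report.splitlines():
    if line.startswith(("Configured Capacity", "DFS Remaining",
                        "Live datanodes", "Dead datanodes")):
        print(line)
```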
Posted 2 months ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Lead/Senior Big Data Engineer
Type: Full-time
Location: Pune
Experience: 8+ years (must-have)
Notice period: immediate joiners only

To be successful in this role, you should possess
- Ability to collaborate closely with Product Management and Engineering leadership to devise and build the right solution.
- Participation in design discussions and brainstorming sessions to select, integrate, and maintain the Big Data tools and frameworks required to solve Big Data problems at scale.
- Experience designing and implementing systems to cleanse, process, and analyze large data sets using distributed processing tools like Akka and Spark.
- Ability to understand and critically review existing data pipelines, and to collaborate with technical leaders and architects on ideas to resolve current bottlenecks.
- Initiative and the drive to pick up new work proactively, acting as a senior individual contributor across our multiple products and features.
- 7+ years of experience in developing highly scalable Big Data pipelines.
- In-depth understanding of the Big Data ecosystem, including processing frameworks like Spark, Akka, Storm, and Hadoop, and the file types they deal with.
- Experience with ETL and data pipeline tools like Apache NiFi, Airflow, etc.
- Excellent coding skills in Java or Scala, including the judgment to apply appropriate design patterns when required.
- Experience with Git and build tools like Gradle/Maven/SBT.
- Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
- An elegant, readable, maintainable, and extensible code style.

You are someone who would easily be able to
- Work closely with the US and India engineering teams to help build Java/Scala-based data pipelines.
- Lead the India engineering team in technical excellence and ownership of critical modules; own the development of new modules and features.
- Troubleshoot live production server issues.
- Handle client coordination, work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision.
- Follow Agile methodology and use JIRA for work planning and issue management/tracking.

Additional project/soft skills:
- Able to work independently with India- and US-based team members.
- Strong verbal and written communication, with the ability to articulate problems and solutions over phone and email.
- Strong sense of urgency, with a passion for accuracy and timeliness.
- Able to work calmly in high-pressure situations and manage multiple projects/tasks.
- Able to work independently, with superior issue-resolution skills.
- Passionate about learning, implementing, analysing, and troubleshooting.
- Automation in your blood.
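To make the "cleanse, process, and analyze large data sets" responsibility concrete, here is a minimal PySpark sketch (Python rather than the Java/Scala the posting names, purely for brevity) that normalizes and deduplicates a raw dataset. The input path and column names are hypothetical.

```python
# Minimal PySpark sketch: cleanse and deduplicate a raw dataset.
# Paths and column names are hypothetical; run with spark-submit or pyspark.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("cleanse-example").getOrCreate()

raw = spark.read.json("hdfs:///data/raw/events/")  # placeholder path

cleansed = (
    raw
    .withColumn("email", F.lower(F.trim(F.col("email"))))  # normalize text
    .withColumn("event_ts", F.to_timestamp("event_ts"))    # enforce types
    .filter(F.col("user_id").isNotNull())                  # drop bad rows
    .dropDuplicates(["user_id", "event_ts"])               # dedupe
)

cleansed.write.mode("overwrite").parquet("hdfs:///data/clean/events/")
```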
Posted 2 months ago
5.0 years
0 Lacs
Mohali, Punjab
On-site
Job Information
Date Opened: 05/28/2025
Job Type: Full time
Industry: Education
Work Experience: 5+ years
City: S.A.S. Nagar (Mohali)
State/Province: Punjab
Country: India
Zip/Postal Code: 160062

Job Description
About Data Couch Pvt. Ltd.
Data Couch Pvt. Ltd. is a leading consulting and enterprise training company specializing in Data Engineering, Big Data, Cloud Technologies, DevOps, and AI/ML. With a footprint across India and strategic global client partnerships, we empower organizations through data-driven decision-making and digital transformation. Our team of expert consultants and trainers works hands-on with the latest technologies to deliver enterprise-grade solutions and skill-building programs.

Key Responsibilities
- Design, develop, and optimize scalable data pipelines using PySpark.
- Leverage Hadoop ecosystem tools (e.g., HDFS, Hive) for big data processing.
- Build and manage data workflows on cloud platforms (AWS, GCP, Azure).
- Use Kubernetes for orchestration of containerized data services.
- Collaborate with data scientists, analysts, and engineers to integrate data solutions.
- Monitor, maintain, and improve data workflows for performance and reliability.
- Ensure data governance, compliance, and best practices in documentation.
- Support MLOps and the integration of AI/ML pipelines into data workflows.

Requirements
Must-haves:
- 6-7+ years of experience in data engineering roles.
- Strong expertise in PySpark for ETL, transformation, and distributed data processing.
- Hands-on experience with at least one major cloud platform (AWS, GCP, Azure).
- Solid understanding of Hadoop tools such as HDFS and Hive.
- Experience with Kubernetes for container orchestration.
- Proficiency in Python and SQL.
- Experience working with large-scale, distributed data systems.

Good-to-have:
- Familiarity with tools like Apache Airflow, Kafka, NiFi, or Databricks.
- Experience with cloud data warehouses such as Snowflake, Redshift, or BigQuery.
- Exposure to MLOps frameworks such as MLflow, TensorFlow, or PyTorch.
- Understanding of DevOps, CI/CD, and version control (Git, GitLab, Jenkins).

Benefits
- Innovative environment: thrive in a collaborative culture that values innovation and continuous improvement.
- Learning & growth: access to internal training programs, certifications, and mentorship from industry experts.
- Career development: structured growth opportunities and competitive compensation.
- Cutting-edge tech: work with modern data technologies and contribute to digital transformation initiatives.
- Health & wellbeing: comprehensive medical insurance coverage.
- Work-life balance: generous paid leave.
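As a sketch of the PySpark-on-cloud pipeline work described above, here is a minimal example that reads from Hive, aggregates, and writes partitioned Parquet to object storage. The bucket, table, and column names are hypothetical, and the s3a connector is assumed to be configured on the cluster.

```python
# Minimal PySpark sketch: Hive source -> daily aggregate -> partitioned
# Parquet on S3. Table, bucket, and column names are hypothetical; assumes
# the cluster has Hive support and an s3a connector configured.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("daily-aggregates")
         .enableHiveSupport()
         .getOrCreate())

orders = spark.table("sales.orders")  # hypothetical Hive table

daily = (orders
         .withColumn("order_date", F.to_date("order_ts"))
         .groupBy("order_date", "country")
         .agg(F.sum("amount").alias("revenue"),
              F.countDistinct("customer_id").alias("buyers")))

(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/marts/daily_revenue/"))  # placeholder
```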
Posted 2 months ago
1.0 - 5.0 years
3 - 7 Lacs
Mumbai
Work from Office
A Data Engineer identifies business problems and translates them into data services and engineering outcomes. You will deliver data solutions that empower better decision making, with solutions flexible enough to scale to broader business questions.

Key responsibilities
As a Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain, and engage with fellow engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and about the flexibility of solutions that scale to broader business questions. If you love solving problems with your skills, come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.
- Consistently strive to acquire new skills in Cloud, DevOps, Big Data, AI, and ML.
- Understand the business problem and translate it into data services and engineering outcomes.
- Explore new technologies and learn new techniques to solve business problems creatively.
- Think big, and drive the strategy for better data quality for customers.
- Collaborate with many teams, engineering and business alike, to build better data products.

Preferred qualifications
- 1-2+ years of hands-on experience with at least one programming language (Python, Java, Scala).
- Understanding of SQL is a must.
- Big data: Hadoop, Hive, YARN, Sqoop.
- MPP platforms: Spark, Pig, Presto.
- Data pipeline and scheduler tools: Oozie, Airflow, NiFi.
- Streaming engines: Kafka, Storm, Spark Streaming.
- Experience with any relational database or data warehouse.
- Experience with any ETL tool.
- Hands-on experience in pipeline design, ETL, and application development.
- Good communication skills.
- Experience working independently and strong analytical skills.
- Dependable and a good team player.
- Desire to learn and work with new technologies.
- Automation in your blood.
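Since the posting calls out scheduler tools such as Oozie, Airflow, and NiFi, here is a minimal DAG sketch for a recent Airflow 2.x release that chains an extract and a load step. The DAG id, schedule, and task bodies are placeholders.

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> load chain.
# DAG id, schedule, and task bodies are placeholders for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull data from a source system for the run's logical date.
    print("extracting for", context["ds"])

def load(**context):
    # Placeholder: write transformed data to the warehouse.
    print("loading for", context["ds"])

with DAG(
    dag_id="daily_etl_example",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # "schedule_interval" on Airflow older than 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```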
Posted 2 months ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description
Airtel is a leading global connectivity provider dedicated to unlocking endless opportunities for its consumers. With a focus on innovation and creativity, Airtel delivers impactful solutions and cutting-edge technologies like 5G, IoT, IQ, and Airtel Black. Committed to a 'Digital India' vision, Airtel serves millions of individuals, with a subscriber base exceeding 574 million as of 2023.

Role Description
This is a full-time on-site Big Data Architect role located in Gurugram at Airtel. The Big Data Architect will be responsible for designing, implementing, and maintaining the company's big data infrastructure. This role involves analyzing data requirements, creating data models, and ensuring data security and integrity.

Responsibilities
- Handle existing or new Apache HDFS clusters, including commissioning and decommissioning of name nodes, data nodes, and edge nodes.
- Work closely with data architects and analysts to design technical solutions.
- Integrate and ingest data from multiple source systems into big data environments.
- Develop end-to-end data transformations and workflows, ensuring logging and recovery mechanisms.
- Troubleshoot Spark job failures.
- Design and implement batch, real-time, and near-real-time data pipelines.
- Optimize big data transformations using Apache Spark, Hive, and Tez.
- Work with Data Science teams to enhance actionable insights.
- Ensure seamless data integration and transformation across multiple systems.

Qualifications
- 4+ years of mandatory experience with big data.
- 4+ years of mandatory experience in Apache Spark.
- Proficiency in Apache Spark, Hive on Tez, and Hadoop ecosystem components.
- Strong coding skills in Python and PySpark.
- Experience building reusable components or frameworks using Spark.
- Expertise in data ingestion from multiple sources using APIs, HDFS, and NiFi.
- Solid experience working with structured, unstructured, and semi-structured data formats (text, JSON, Avro, Parquet, ORC, etc.).
- Experience with UNIX Bash scripting and databases like Postgres, MySQL, and Oracle.
- Ability to design, develop, and evolve fault-tolerant distributed systems.
- Strong SQL skills, with expertise in Hive, Impala, MongoDB, and NoSQL databases.
- Hands-on experience with Git and CI/CD tools.
- Experience with streaming data technologies (Kafka, Spark Streaming, Apache Flink, etc.).
- Proficiency with HDFS or similar data lake technologies.
- Excellent problem-solving skills; you will be evaluated through coding rounds.
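As an illustration of the real-time pipeline work above, here is a minimal Spark Structured Streaming sketch that reads JSON events from Kafka and appends them to Parquet. The broker address, topic, schema, and paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the classpath.

```python
# Minimal Spark Structured Streaming sketch: Kafka JSON -> Parquet sink.
# Brokers, topic, schema, and paths are placeholders; requires the
# spark-sql-kafka-0-10 connector on the classpath.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("kafka-to-parquet").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("action", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
          .option("subscribe", "events")                      # placeholder
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/stream/events/")       # placeholder
         .option("checkpointLocation", "hdfs:///chk/events/")
         .outputMode("append")
         .start())

query.awaitTermination()
```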
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Lead Product Manager - Technical

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview:
We are looking for a Lead Product Manager - Technical to join the PVS Identity Solutions team and drive our strategy forward by consistently innovating and problem-solving. The ideal candidate is passionate about technology, highly motivated, intellectually curious, analytical, and possesses an entrepreneurial mindset.

Role
- Provide technical leadership for new major initiatives.
- Deliver innovative, cost-effective solutions that align with enterprise standards.
- Be an integrated part of an Agile engineering team, working interactively with software engineering leads, architects, and testing engineers from the beginning of the development cycle.
- Help ensure functionality delivered in each release is fully tested end to end.
- Manage multiple priorities and tasks in a dynamic work environment.
- Identify issues that will keep the platform features from delivering on time and/or with the desired requirements, and communicate them to leadership.
- Work with internal teams and customer service to identify, classify, and prioritize feature-level customer issues.
- Coordinate internal forums to collect and identify feature-level development opportunities.
- Own and manage product documentation; enable self-service support and/or work to reduce overhead.
- Identify feature risks from business and customer feedback and in-depth analysis of operational performance; share with senior leadership.
- Digest business customer requirements (user stories, use cases) and platform requirements for a platform feature set.
- Determine release goals for the platform and prioritize assigned features according to business and platform value, adjusting throughout implementation as needed.
- Review product demos with the development team against acceptance criteria for the feature set.

All About You
- Bachelor's degree in computer science or equivalent work experience, with hands-on technical and quality engineering skills.
- Excellent technical acumen, strong organizational and problem-solving skills with great attention to detail, critical thinking, solid communication, and proven leadership skills.
- Solid leadership and mentoring skills with the ability to drive change.
- Knowledge of Java, SQL, APIs (REST/SOAP), code reviews, scanning tools and configuration, and branching techniques.
- Experience with application monitoring tools such as Dynatrace and Splunk.
- Experience with chaos, software security, and crypto testing practices.
- Experience with DevOps practices (continuous integration and delivery, and tools such as Jenkins).
- Nice to have: knowledge of or prior experience with orchestration tools such as Apache NiFi and Apache Airflow, and an understanding of microservices architecture.
- Take the time to fully learn the functionality, architecture, dependencies, and runtime properties of the systems supporting your platform products. This includes the business requirements and associated use cases, Mastercard customers' experience, Mastercard's back office systems, the technical stack (application/service architecture), interfaces and associated data flows, dependent applications/services, runtime operations (i.e., trouble management and associated support strategies), and maintenance.
- Understand and be able to explain the business context and the associated customer use cases.
- Proficient at grooming user stories, setting entrance/exit criteria, and prioritizing a platform product backlog.
- Understand the technologies supporting the platform product and be able to hold your own in debates with other PM-Ts, TPMs, SDEs, and SPMs.
- Recognize discordant views and take part in constructive dialog to resolve them.
- Communicate clearly and concisely, both verbally and in writing.
- Improve team processes to accelerate delivery, drive innovation, lower costs, and improve quality.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach, and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-249228
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Lead Software Test Engineer (Automation Tester) - Lead SDET

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Job Overview:
As part of an exciting, fast-paced environment developing payment authentication and security solutions, this position will provide technical leadership and expertise within the development lifecycle for the ecommerce payment authentication platform under the Authentication program for Digital.

Overview:
We are looking for an Automation Tester to join the PVS Identity Solutions team. This is a pivotal role, responsible for QA, load testing, and automation of various data-driven pipelines. The position involves managing testing infrastructure for functional testing and automation, and coordinating testing that spans multiple programs and projects. The ideal candidate will have experience working with large-scale data and automation testing of Java and cloud-native applications/services.
Responsibilities
- Lead the development and maintenance of automated testing frameworks.
- Provide technical leadership for new major initiatives.
- Deliver innovative, cost-effective solutions that align with enterprise standards.
- Drive the reduction of time spent testing.
- Work to minimize manual testing by identifying high-ROI test cases and automating them.
- Be an integrated part of an Agile engineering team, working interactively with software engineering leads, architects, testing engineers, and product managers from the beginning of the development cycle.
- Help ensure functionality delivered in each release is fully tested end to end.
- Manage multiple priorities and tasks in a dynamic work environment.

All About You
- Bachelor's degree in computer science or equivalent work experience, with hands-on technical and quality engineering skills.
- Expertise in testing methods, standards, and conventions, including automation and test case creation.
- Excellent technical acumen, strong organizational and problem-solving skills with great attention to detail, critical thinking, solid communication, and proven leadership skills.
- Solid leadership and mentoring skills with the ability to drive change.
- Experience in testing ETL processes.
- Experience in test automation frameworks and agile.
- Knowledge of Python/Hadoop/Spark, Java, SQL, APIs (REST/SOAP), code reviews, scanning tools and configuration, and branching techniques.
- Experience with application monitoring tools such as Dynatrace and Splunk.
- Experience with chaos, software security, and crypto testing practices.
- Experience with performance testing.
- Experience with DevOps practices (continuous integration and delivery, and tools such as Jenkins).
- Nice to have: knowledge of or prior experience with Apache Kafka, Apache Spark with Scala, orchestration with Apache NiFi or Apache Airflow, microservices architecture, and build tools like Jenkins.
- Good to have: mobile testing skills.
- Experience working with large data sets with terabytes of data.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach, and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-249227
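As a small illustration of the test-automation work this role covers, here is a minimal pytest sketch for a REST endpoint. The URL, payload, and expected fields are hypothetical stand-ins, not any real Mastercard API.

```python
# Minimal pytest sketch for REST API test automation.
# Endpoint, payload, and response fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # placeholder service under test

def test_create_authentication_request():
    payload = {"pan_token": "tok_123", "amount": 1000, "currency": "USD"}
    resp = requests.post(f"{BASE_URL}/v1/authentications", json=payload,
                         timeout=10)

    assert resp.status_code == 201
    body = resp.json()
    assert body["status"] == "PENDING"  # assumed response contract
    assert "authentication_id" in body

def test_rejects_missing_amount():
    resp = requests.post(f"{BASE_URL}/v1/authentications",
                         json={"pan_token": "tok_123"}, timeout=10)
    assert resp.status_code == 400
```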
Posted 2 months ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Description:
- Expert-level skills in Java 11+ and Spring Boot (Spring Cloud, Spring Security, JPA).
- Strong knowledge of RESTful web services.
- High proficiency with development tools and workflows (JUnit, Maven, continuous workflow, etc.).
- Log4j; SSO (single sign-on implementation); Maven, JUnit, Sonar.
- Experience in SQL and MongoDB.
- Good to have: Apache NiFi, Groovy scripting.

Requirements
- Knowledge of the software development lifecycle or IT infrastructure.
- Good problem-solving and analytical skills.
- Strong communication and teamwork abilities.
- Willingness to learn and adapt in a fast-paced environment.

Benefits
- Comprehensive medical insurance covering the entire family, ensuring health and well-being for employees and their loved ones.
- Hybrid work culture: only 3 days in-office per week, offering flexibility and reducing commute stress.
- Healthy work-life balance that prioritizes employee well-being, allowing time for personal and professional growth.
- Friendly working environment: a supportive and collaborative workplace culture that fosters positivity and teamwork.
- No variable pay deductions: guaranteed full compensation with no unpredictable pay cuts, ensuring financial stability.
- Certification reimbursement: the company will reimburse the cost of certifications taken based on project demands.
Posted 2 months ago
10.0 years
0 Lacs
India
On-site
Embark on an exciting journey into the realm of data analytics with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. As a Snowflake Data Architect, you will be at the heart of the organization and support our clients in taking control of their data and getting value out of it by defining a reference architecture for our customers. This means that you will work closely with business leaders and information management teams to define and implement a roadmap for data management, business intelligence, or analytics solutions. If you have a passion for data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering! 🌍🔥

Relevant Experience: 10+ years in Data Practice
Must-have Skills: Snowflake, Data Architecture, Engineering, Governance, and Cloud services

Responsibilities
- Assess existing data components, perform POCs, and consult with stakeholders.
- Lead the migration to Snowflake.
- In a strong client-facing role, propose, advocate for, and lead implementation of end-to-end solutions to an enterprise's data-specific business problems, taking care of data collection, extraction, integration, cleansing, enrichment, and data visualization.
- Design large data platforms to enable Data Engineers, Analysts, and Scientists.
- Bring strong exposure to different data architectures, data lakes, and data warehouses, including migrations, re-architecture, and platform modernization.
- Define tools and technologies to develop automated data pipelines, write ETL processes, develop dashboards and reports, and create insights.
- Continually reassess the current state for alignment with architecture goals, best practices, and business needs.
- Handle DB modeling, decide on the best data storage, create data flow diagrams, and maintain related documentation.
- Take care of performance, reliability, reusability, resilience, scalability, security, privacy, and data governance while designing the data architecture.
- Apply or recommend best practices in architecture, coding, API integration, and CI/CD pipelines.
- Coordinate with data scientists, analysts, and other stakeholders on data-related needs.
- Help the Data Science & Analytics Practice grow by mentoring junior Practice members, leading initiatives, and leading Data Practice offerings.
- Provide thought leadership by representing the Practice / Organization on internal and external platforms.

Qualifications
- Ability to translate business requirements into data requests, reports, and dashboards.
- Strong database and modeling concepts, with exposure to SQL and NoSQL databases.
- Strong grasp of data architecture patterns and principles, with the ability to design secure and scalable data lakes, data warehouses, data hubs, and other event-driven architectures.
- Expertise in designing and writing ETL processes.
- Strong experience with Snowflake and its components.
- Knowledge of Master Data Management and related tools.
- Strong exposure to data security and privacy regulations (GDPR, HIPAA) and best practices.
- Skilled in ensuring data accuracy, consistency, and quality.
- Experience with AWS services, viz. S3, Redshift, Lambda, DynamoDB, EMR, Glue, Lake Formation, Athena, QuickSight, RDS, Kinesis, Managed Kafka, Elasticsearch and ElastiCache, API Gateway, and CloudWatch.
- Ability to implement data validation processes and establish data quality standards.
- Experience in Linux and scripting.
- Proficiency in data visualization tools like Tableau, Power BI, or similar, to create meaningful insights.

Additional Experience Desired:
- Experience working with data ingestion tools such as Fivetran, Stitch, or Matillion.
- AWS IoT solutions.
- Apache NiFi, Talend, Informatica.
- Knowledge of GCP data services.
- Exposure to AI / ML technologies.
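To ground the Snowflake migration work described above, here is a minimal sketch using the snowflake-connector-python package to stage a local file and bulk-load it with COPY INTO. The account, warehouse, database, table, and file path are hypothetical placeholders.

```python
# Minimal sketch: load a local CSV into Snowflake via the table's internal
# stage. Connection details, table, and file paths are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",   # placeholder account locator
    user="LOADER",
    password="...",      # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Stage the file into the table's internal stage, then bulk load it.
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS OVERWRITE = TRUE")
    cur.execute("""
        COPY INTO ORDERS
        FROM @%ORDERS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```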
Posted 2 months ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Lead Data Engineer - Transfer Solutions

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Team Overview
Transfer Solutions is responsible for driving Mastercard's expansion into new payment flows such as disbursements and remittances. The team is working on creating a market-leading money transfer proposition, Mastercard Move, to power the next generation of payments between people and businesses, whether money is moving domestically or across borders, by delivering the ability to pay and get paid with choice, transparency, and flexibility. The Product & Engineering teams within Transfer Solutions are responsible for designing, developing, launching, and maintaining products and services designed to capture these flows from a wide range of customer segments. By addressing customer pain points for domestic and cross-border transfers, the goal is to scale Mastercard's Disbursements & Remittances business, trebling volume over the next 4 years. If you would like to be part of a global, cross-functional team delivering a highly visible, strategically important initiative in an agile way, this role will be attractive to you. Do you like to be part of a team that is creating and executing strategic initiatives centered around digital payments? Do you look forward to developing and engaging with high-performing, diverse teams around the globe? Would you like to be part of a highly visible, strategically important global engineering organization?

The Role
We are looking for an experienced Data Engineer to design and develop advanced data migration pipelines from traditional OLTP databases (e.g., Oracle) to modern big data platforms such as Cloudera and Databricks. The ideal candidate will possess expertise in technologies such as Python, Java, Spark, and NiFi, along with a proven track record in managing data pipelines for tasks including initial snapshot loading, building Change Data Capture (CDC) pipelines, exception management, reconciliation, data security, and retention. This role also demands proficiency in data modeling, cataloging, and taxonomy creation, and in ensuring robust data provenance and lineage to support governance and compliance requirements.
Key Responsibilities
- Design, develop, and optimize data migration pipelines from OLTP databases like Oracle to big data platforms, including Cloudera CDP/CDH and Databricks.
- Build scalable ETL workflows using tools like Python, Scala, Apache Spark, and Apache NiFi to support initial snapshots, CDC, exception handling, and reconciliation processes.
- Implement data security measures, such as encryption, access controls, and compliance with data retention policies, across all migration pipelines.
- Develop and maintain data models, taxonomy structures, and cataloging systems to ensure logical organization and easy accessibility of data.
- Establish data lineage and provenance to ensure traceability and compliance with governance frameworks.
- Collaborate with cross-functional teams to understand data migration requirements, ensuring high-quality and timely delivery of solutions.
- Monitor and troubleshoot data pipelines to ensure performance, scalability, and reliability.
- Stay updated on emerging technologies in data engineering and big data ecosystems, proposing improvements to existing systems and processes.

Required Skills And Qualifications
- 10+ years of experience in data engineering, with at least 2 years in a leadership or technical lead role.
- Proficiency in OLTP databases, particularly Oracle, and data egress techniques.
- Strong programming skills in Python, Scala, and Java.
- Expertise in Apache Spark, Flink, and Kafka, and data integration tools like Apache NiFi.
- Hands-on experience with the Cloudera Data Platform (CDP/CDH) and Apache Ozone.
- Familiarity with cloud-based big data ecosystems such as AWS Databricks, S3, Glue, etc.
- Familiarity with patterns such as Medallion, data layers, data lakes, and data warehouses; experience building scalable ETL pipelines, optimizing data workflows, and leveraging platforms to integrate, transform, and store large datasets.
- Knowledge of data security best practices, including encryption, data masking, and role-based access control.
- Exceptional problem-solving and analytical abilities.
- Strong communication and leadership skills, with the ability to navigate ambiguity and collaborate effectively across diverse teams.
- Optional: awareness of regulatory compliance requirements for data handling and privacy.

Education: Bachelor's or Master's degree in Computer Science

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach, and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-233628
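For the CDC pipeline responsibility above, here is a minimal sketch of applying a batch of change records to a Delta Lake table with MERGE, a common pattern on Databricks. The table paths, key column, and the op (I/U/D) change-record layout are hypothetical, and the delta-spark package is assumed to be configured on the Spark session.

```python
# Minimal CDC-apply sketch using Delta Lake MERGE (e.g., on Databricks).
# Paths, keys, and the I/U/D op format are hypothetical; assumes the
# delta-spark package is configured on the Spark session.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cdc-apply").getOrCreate()

# A batch of change records extracted from the source (e.g., via NiFi):
# columns: id, name, balance, op  where op is 'I', 'U', or 'D'.
changes = spark.read.parquet("/mnt/landing/accounts_cdc/")  # placeholder

target = DeltaTable.forPath(spark, "/mnt/bronze/accounts")  # placeholder

(target.alias("t")
 .merge(changes.alias("s"), "t.id = s.id")
 .whenMatchedDelete(condition="s.op = 'D'")        # apply deletes
 .whenMatchedUpdateAll(condition="s.op = 'U'")     # apply updates
 .whenNotMatchedInsertAll(condition="s.op = 'I'")  # apply inserts
 .execute())
```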
Posted 2 months ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Senior Analyst, Data Strategy

Overview
The Data Quality team in the Data Strategy & Management organization (Chief Data Office) is responsible for developing and driving Mastercard's core data management program and broadening its data quality efforts. This team ensures that Mastercard maintains and increases the value of Mastercard's data assets as we enable new forms of payment and new players in the ecosystem, and expand the collection and use of data to support new lines of business. The Enterprise Data Quality team is responsible for ensuring data is of the highest quality and is fit for purpose to support business and analytic uses. The team works to identify and prioritize opportunities for data quality improvement, develops strategic mitigation plans, and coordinates remediation activities with MC Tech and the business owner.

Role
- Support the processes for improving and expanding merchant data, including address standardization, geocoding, and incorporating new merchant data sources to assist in improvements.
- Assess quality issues in merchant data, transactional data, and other critical data sets.
- Support internal and external feedback loops to improve data submitted through the transaction data stream.
- Frame and present data challenges in a manner suitable for product and business understanding.
- Provide subject matter expertise on merchant data for the organization's product development efforts.
- Coordinate with MC Tech to develop DQ remediation requirements for core systems, the data warehouse, and other critical applications. Manage corresponding remediation projects to ensure successful implementation.
- Lead organization-wide awareness and communication of data quality initiatives and remediation activities, ensuring seamless implementation.
- Coordinate with critical vendors to manage project timelines and achieve quality deliverables.
- Develop and implement, with MC Tech, data pipelines that extract, transform, and load data into an information product that supports organizational strategic goals.
- Implement new technologies and frameworks as per project requirements.

All About You
- Hands-on experience managing technology projects, with demonstrated ability to understand complex data and technology initiatives.
- Ability to lead and influence others to advance deliverables.
- Understanding of emerging technologies, including but not limited to cloud architecture, machine learning/AI, and Big Data infrastructure.
- Data architecture experience and experience building data models.
- Experience deploying and working with big data technologies like Hadoop, Spark, and Sqoop.
- Experience with streaming frameworks like Kafka and Axon, and pipelines like NiFi.
- Proficient in OO programming (Python, Java/Spring Boot/J2EE, and Scala).
- Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala).
- Experience with Linux, the Unix command line, Unix shell scripting, SQL, and any scripting language.
- Experience with data visualization tools such as Tableau, Domo, and/or Power BI is a plus.
- Experience presenting data findings in a readable, insight-driven format.
- Experience building support decks.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach, and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-249888
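As a sketch of the merchant data-quality assessment described above, here is a minimal PySpark example computing null rates and duplicate counts over a hypothetical merchant table. The path and column names are illustrative, not Mastercard's actual schema.

```python
# Minimal data-quality profiling sketch in PySpark: null/blank rates and
# duplicate counts over a hypothetical merchant dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("merchant-dq").getOrCreate()

merchants = spark.read.parquet("/data/merchants/")  # placeholder path
total = merchants.count()

# Null/blank rate per critical column.
checked_cols = ["merchant_id", "name", "address", "postal_code", "mcc"]
null_rates = merchants.select([
    (F.sum(F.when(F.col(c).isNull() | (F.trim(F.col(c)) == ""), 1)
           .otherwise(0)) / total).alias(c)
    for c in checked_cols
])
null_rates.show()

# Duplicate merchant records sharing the same name + address.
dupes = (merchants.groupBy("name", "address")
         .count()
         .filter(F.col("count") > 1))
print("duplicate name+address groups:", dupes.count())
```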
Posted 2 months ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title And Summary
Data Engineer-2

Overview
We are the global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Our Team Within Mastercard - Services
The Services org is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.

Data Analytics And AI Solutions (DAAI) Program
Within the D&S Technology Team, the DAAI program is a relatively new program comprising a rich set of products that provide accurate perspectives on Portfolio Optimization and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API- and web-application-based data publishing to allow for seamless integration into other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes. We are looking for an innovative software engineering lead to head the technical design and development of an Analytic Foundation. The Analytic Foundation is a suite of individually commercialized analytical capabilities (think prediction as a service, matching as a service, or forecasting as a service) that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects.

Here are a few examples of products in our space:
- Portfolio Optimizer (PO) is a solution that leverages Mastercard's data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
- Ad Insights uses anonymized and aggregated transaction insights to offer targeting segments with a high likelihood of making purchases within a category, allowing for more effective campaign planning and activation.

Help found a new, fast-growing engineering team!

Position Responsibilities
As a Data Engineer within DAAI, you will:
- Play a large role in the implementation of complex features.
- Push the boundaries of analytics and powerful, scalable applications.
- Build and maintain analytics and data models to enable performant and scalable products.
- Ensure a high-quality code base by writing and reviewing performant, well-tested code.
- Mentor junior engineers and teammates.
- Drive innovative improvements to team development processes.
- Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases, and apply that knowledge to scoping and building new modules and features.
- Collaborate across teams with exceptional peers who are passionate about what they do.

Ideal Candidate Qualifications
- 4+ years of data engineering experience in an agile production environment.
- Experience leading the design and implementation of large, complex features in full-stack applications.
- Ability to move easily between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it.
- Experience leveraging open source tools, predictive analytics, machine learning, advanced statistics, and other data techniques to perform analyses.
- High proficiency in using Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build Big Data products and platforms.
- Experience building and deploying production-level data-driven applications and data processing workflows/pipelines, and/or implementing machine learning systems at scale in Java, Scala, or Python, delivering analytics involving all phases such as data ingestion, feature engineering, modeling, tuning, evaluating, monitoring, and presenting.
- Experience with cloud technologies like Databricks/AWS/Azure.
- Strong technologist with a proven track record of learning new technologies and frameworks.
- Customer-centric development approach.
- Passion for analytical / quantitative problem solving.
- Experience identifying and implementing technical improvements to development processes.
- Collaboration skills, with experience working with people across roles and geographies.
- Motivation, creativity, self-direction, and desire to thrive on small project teams.
- Superior academic record with a degree in Computer Science or a related technical field.
- Strong written and verbal English communication skills.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach, and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-246504
Posted 2 months ago
3.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us
All people need connectivity. The Rakuten Group is reinventing telecom by greatly reducing cost, rewarding big users rather than penalizing them, empowering more people, and leading the human-centric AI future. The mission is to connect everybody and enable all to be. Rakuten. Telecom Invented.

Job Description
Job Title: Data Engineer
Location: Bangalore (Onsite)

Why should you choose us?
Rakuten Symphony is a Rakuten Group company that provides global B2B services for the mobile telco industry and enables next-generation, cloud-based, international mobile services. Building on the technology Rakuten used to launch Japan's newest mobile network, we are taking our mobile offering global. To support our ambitions to provide an innovative cloud-native telco platform for our customers, Rakuten Symphony is looking to recruit and develop top talent from around the globe. We are looking for individuals to join our team across all functional areas of our business, from sales to engineering, support functions to product development. Let's build the future of mobile telecommunications together!

What will you do?
Our Data Platform team is building a world-class autonomous data service platform offering services such as data lake as a service, database as a service, data transformation as a service, and AI services. We are looking for a Data Engineer to help us build functional systems for our data platform services. As a Data Engineer on our autonomous data platform, you will be responsible for the end-to-end research and development of relevant features for the platform, following the product lifecycle of the offerings. If you have in-depth knowledge of Spark, NiFi, and other distributed systems, we'd like to meet you. Ultimately, you will develop various self-managed and scalable services of the data platform, which will be offered as cloud services to end users.

Roles And Responsibilities
- Work experience as a Data Engineer or in a similar software engineering role (3-8 years).
- Good knowledge of NiFi, Spark, and distributed ecosystems; knowledge of Kubernetes application development and how to make K8s-centric applications is a plus.
- Expertise in development using Java/Scala (not Python).
- Ability to quickly design and implement fault-tolerant, highly available pipelines using distributed ecosystems, with hands-on experience in any NoSQL DB (e.g., Cassandra), Presto/Trino, NiFi, and Airflow.
- Sound knowledge of the Spring Framework and Spring-based solutions for web application development.
- Problem-solving attitude, a tinkering approach, and creative thinking.

Our Commitment To You
Rakuten Group's mission is to contribute to society by creating value through innovation and entrepreneurship. By providing high-quality services that help our users and partners grow, we aim to advance and enrich society. To fulfill our role as a Global Innovation Company, we are committed to maximizing both corporate and shareholder value.
Job Requirement
Responsibilities
- Research and comparative analysis.
- System architecture design.
- Implement integrations.
- Deploy updates and fixes.
- Perform root cause analysis for production errors.
- Investigate and resolve technical issues.
- Architecture and support documentation.
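Given the Presto/Trino experience the role asks for, here is a minimal sketch of querying Trino from Python with the trino-python-client package. The coordinator host, catalog, schema, and table are hypothetical placeholders.

```python
# Minimal sketch: query Trino from Python using the trino-python-client.
# Host, catalog, schema, and table are hypothetical placeholders.
import trino

conn = trino.dbapi.connect(
    host="trino.example.internal",  # placeholder coordinator
    port=8080,
    user="data-platform",
    catalog="hive",
    schema="telemetry",
)

cur = conn.cursor()
cur.execute("""
    SELECT cell_id, count(*) AS events
    FROM network_events          -- hypothetical table
    WHERE event_date = DATE '2025-01-01'
    GROUP BY cell_id
    ORDER BY events DESC
    LIMIT 10
""")

for cell_id, events in cur.fetchall():
    print(cell_id, events)
```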
Posted 2 months ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 10
Position Title: Senior Software Developer

The Team: Do you love to collaborate and provide solutions? This team comes together across eight different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients in all client-facing applications. You will have a highly visible role where even small changes have very wide impact.

What's in it for you?
- Opportunities for innovation and learning new state-of-the-art technologies
- Work in a pure agile and scrum methodology

Responsibilities
- Design and implement software-related projects.
- Perform analyses and articulate solutions.
- Design underlying engineering for use in multiple product offerings supporting a large volume of end-users.
- Develop project plans with task breakdowns and estimates.
- Manage and improve existing solutions.
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.

Basic Qualifications - What we're Looking For:
- Bachelor's degree in Computer Science or equivalent
- 6+ years' related experience
- Passionate, smart, and articulate developer
- Strong C#, WPF, and SQL skills
- Experience implementing web services (with WCF, RESTful JSON, SOAP, TCP), Windows services, and unit tests
- Dependency injection
- Able to demonstrate strong OOP skills
- Able to work well individually and with a team
- Strong problem-solving skills
- Good work ethic, self-starter, and results-oriented
- Interest and experience in Environmental and Sustainability content is a plus
- Agile/Scrum experience is a plus
- Exposure to Data Engineering and Big Data technologies like Hadoop, Spark/Scala, NiFi, and ETL is a plus

Preferred Qualifications
- Experience with Docker is a plus
- Experience working in cloud computing environments such as AWS, Azure, or GCP
- Experience with large-scale messaging systems such as Kafka, RabbitMQ, or commercial systems

What's In It For You?

Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind.
Join us and help create the critical insights that truly make a difference. Our Values Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring And Opportunity At S&P Global At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 315230 Posted On: 2025-05-05 Location: Gurgaon, Haryana, India Show more Show less
Posted 2 months ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What is Blend?
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com
What is the Role?
We are seeking a highly skilled Lead Data Engineer to join our data engineering team for an on-premise environment. A large portion of your time will be spent in the weeds working alongside your team, architecting, designing, implementing, and optimizing data solutions. The ideal candidate will have extensive experience in building and optimizing data pipelines, architectures, and data sets, with a strong focus on Python, SQL, Hadoop, HDFS, and Apache NiFi.
What you'll be doing
Design, develop, and maintain robust, scalable, and high-performance data pipelines and data integration solutions.
Manage and optimize data storage in the Hadoop Distributed File System (HDFS).
Design and implement data workflows using Apache NiFi for data ingestion, transformation, and distribution.
Collaborate with cross-functional teams to understand data requirements and deliver efficient solutions.
Ensure data quality, governance, and security standards are met within the on-premise infrastructure.
Monitor and troubleshoot data pipelines to ensure optimal performance and reliability.
Automate data workflows and processes to enhance system efficiency.
What do we need from you?
Bachelor's degree in Computer Science, Software Engineering, or a related field
6+ years of experience in data engineering or a related field
Strong programming skills in Python and SQL
Hands-on experience with the Hadoop ecosystem (HDFS, Hive, etc.)
Proficiency in Apache NiFi for data ingestion and flow orchestration
Experience in data modeling, ETL development, and data warehousing concepts
Strong problem-solving skills and the ability to work independently in a fast-paced environment
Good understanding of data governance, data security, and best practices in on-premise environments
What do you get in return?
Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
Idea Tanks: Innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best, whether it's over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
Snack Zone: Stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.
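For a concrete sense of the pipeline work this role centres on, here is a minimal PySpark sketch of a batch clean-and-load job over HDFS. It assumes a working PySpark installation with HDFS access; the paths and column names are hypothetical, not taken from the posting.

```python
# Minimal PySpark batch pipeline: read raw events from HDFS, apply basic
# data-quality steps, and write a partitioned table back.
# Paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-cleanup").getOrCreate()

raw = spark.read.json("hdfs:///data/raw/events/")  # hypothetical ingest path

cleaned = (
    raw.dropDuplicates(["event_id"])                 # deduplicate on a key
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("event_type").isNotNull())      # drop malformed rows
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")                   # partition for consumers
        .parquet("hdfs:///data/curated/events/"))

spark.stop()
```

In practice a job like this would be scheduled and monitored by an orchestration layer (the posting names NiFi for ingestion and flow orchestration), with the Spark job handling the heavier transformation step.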
Posted 2 months ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Experience: 10+ years
Location: BKC, Mumbai
Responsibilities:
Evangelize, motivate, and enable our customers on their Enterprise Data Cloud journey
Participate in the pre- and post-sales process, helping the sales, professional services management, and product teams to interpret customer use cases
Use your deep domain, wide technical, and business knowledge to assist customers in defining their data strategy, use-case success criteria, and frameworks to deliver successful implementations
Design and implement Enterprise Data Cloud architectures and configurations for customers
Identify and grow professional services engagements and support subscriptions through the clear demonstration of the value we bring to our customers
Design, create, and recommend standard best-practice design patterns for distributed data pipelines and analytical computing architectures
Plan and deliver presentations and workshops to customer/internal stakeholders
Write and produce technical documentation, blogs, and knowledge base articles
Skills and Experience:
The candidate must have 10+ years of experience
Extensive customer-facing/consulting experience interacting with large-scale distributed data/computing solutions
A strong business understanding of how Cloudera technologies solve real-world business problems
Appreciation of the commercial business cases that drive customer data platform initiatives
Experience managing project delivery and leading a technical team
Strong experience designing, architecting, and implementing software solutions in an enterprise Linux environment, including a solid foundation in OS/networking fundamentals
Strong experience with Hadoop or related technologies, including deployment and administration
Excellent communication skills, experience with public speaking, and the ability to present to a wide range of audiences
Proven knowledge of big data/analytical use cases and best-practice approaches to implementing solutions for them
Strong experience with cloud platforms (e.g., AWS, Azure, Google Cloud)
Experience with open-source ecosystem programming languages (e.g., Python, Java, Scala, Spark, etc.)
Knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
Experience designing frameworks for implementing data transformation and processing solutions on Hadoop or related technologies (e.g., HDFS, Hive, Impala, HBase, NiFi, etc.)
Strong understanding of authentication (e.g., LDAP, Active Directory, SAML, Kerberos) and authorization configuration for Hadoop-based distributed systems
Deep knowledge of the Data/Big Data business domain
Familiarity with BI tools and data science notebooks such as Cloudera Data Science Workbench, Apache Zeppelin, Jupyter, IBM Watson Studio, etc.
Knowledge of scripting tools such as bash shell scripts, Python, or Perl
Familiarity with DevOps methodology and toolsets, and automation experience with Chef, Puppet, Ansible, or Jenkins
Ability to travel ~70%
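To illustrate the Hadoop-side transformation work this engagement describes, here is a minimal sketch using Spark with Hive support. It assumes a cluster where Hive integration is configured for Spark; the database and table names are hypothetical.

```python
# Minimal PySpark-with-Hive sketch: roll up a raw Hive table into a
# reporting table. Database and table names are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("daily-usage-rollup")
         .enableHiveSupport()          # requires a Hive-configured cluster
         .getOrCreate())

daily = spark.sql("""
    SELECT customer_id,
           to_date(event_ts) AS usage_date,
           SUM(bytes_used)   AS total_bytes
    FROM   raw_db.usage_events
    GROUP  BY customer_id, to_date(event_ts)
""")

# Persist the aggregate as a managed Hive table for BI consumers.
daily.write.mode("overwrite").saveAsTable("reporting_db.daily_usage")
spark.stop()
```

On a secured cluster of the kind the posting references, the same job would run under a Kerberos principal with Ranger- or Sentry-style authorization governing table access.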
Posted 2 months ago
7.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Principal Lead - Backend (Java)
Company Overview: Birdeye Inc. is a global leader in customer experience management, empowering businesses with innovative solutions to enhance customer interactions and drive growth.
Job Description: We are seeking a Principal Lead with strong technical expertise in Java, the Spring framework, Kafka, MySQL, and NoSQL databases. The ideal candidate will be responsible for defining system architecture, ensuring scalability, and driving technical innovation. This role requires hands-on experience in designing large-scale backend systems, optimizing performance, and mentoring engineering teams.
Responsibilities:
Define system architecture and ensure scalability, security, and performance of backend services.
Lead end-to-end execution of complex projects, from design to deployment.
Drive PoCs and evaluate new technologies for improving backend efficiency and innovation.
Optimize system efficiency, troubleshoot critical production issues, and improve system reliability.
Architect and implement scalable solutions for real-time data processing using Kafka and NiFi.
Design, optimize, and manage large-scale data storage with MySQL and NoSQL databases.
Collaborate with product managers, architects, and other teams to align technical solutions with business goals.
Provide technical leadership, mentorship, and coaching to engineering teams.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
7+ years of experience in software engineering, with a strong focus on backend development.
Expertise in Java and the Spring framework, and in designing high-performance, scalable systems.
Hands-on experience with Kafka and NiFi for event-driven architectures and large-scale data processing.
Deep understanding of MySQL and NoSQL databases, including data optimization and management.
Strong leadership and technical decision-making skills, with experience mentoring engineers.
This role is ideal for someone who thrives on solving complex technical challenges, driving innovation, and leading high-performing engineering teams.
Interested candidates can send their resumes to iqbal.kaur@birdeye.com
Regards,
Iqbal Kaur
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
About _VOIS:
_VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain, and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone.
About _VOIS India:
In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore, and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations, HR Operations, and more.
Job Role Related Content (Role Specific)
Experience managing the development lifecycle for agile software development projects
Expert-level experience in designing, building, and managing data pipelines for batch and streaming applications
Experience with performance tuning for batch-based applications like Hadoop, including working knowledge of NiFi, YARN, Hive, Airflow, and Spark
Experience with performance tuning streaming-based applications for real-time data processing using Kafka, Confluent Kafka, AWS Kinesis, GCP Pub/Sub, or similar (see the sketch after this posting)
Experience working with serverless services such as OpenShift, GCP, or AWS
Working experience with AWS would be a prerequisite
Working experience with other distributed technologies such as Cassandra, DynamoDB, MongoDB, Elasticsearch, and Flink would be desirable
_VOIS Equal Opportunity Employer Commitment, India:
_VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics.
As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM, and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment.
We look forward to welcoming you into our family, which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we'll be in touch!
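As an illustration of the batch-and-streaming pipeline experience listed above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands micro-batches on HDFS. It assumes the spark-sql-kafka connector is on the Spark classpath; the broker address, topic name, and paths are hypothetical.

```python
# Minimal Spark Structured Streaming sketch: read a Kafka topic and write
# micro-batches to HDFS as Parquet. Broker, topic, and paths are hypothetical;
# requires the spark-sql-kafka package on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "customer-events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/streams/customer-events/")
         .option("checkpointLocation", "hdfs:///checkpoints/customer-events/")
         .start())

query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once file output on restart, which is the kind of reliability concern the performance-tuning bullets above allude to.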
Posted 2 months ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Senior Associate / Manager - NiFi Developer
Job Location: Pan India
The candidate should possess 8 to 12 years of experience, of which 3+ years should be relevant.
Roles & Responsibilities:
Design, develop, and manage data pipelines using Apache NiFi
Integrate with systems like Kafka, HDFS, Hive, Spark, and RDBMS
Monitor, troubleshoot, and optimize data flows
Ensure data quality, reliability, and security
Work with cross-functional teams to gather requirements and deliver data solutions
Skills Required:
Strong hands-on experience with Apache NiFi
Knowledge of data ingestion, streaming, and batch processing
Experience with Linux, shell scripting, and cloud environments (AWS/GCP is a plus)
Familiarity with REST APIs, JSON/XML, and data transformation
Other Information:
Role: NiFi Developer - SA/M
Industry Type: IT/Computers - Software
Functional Area: IT-Software
Required Education: Any Graduate
Employment Type: Full Time, Permanent
Key Skills: NiFi Developer, Design, Development
Job Code: GO/JC/21435/2025
Recruiter Name: SPriya
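For flavour, here is a minimal monitoring sketch of the kind this role implies: polling a NiFi instance's flow status over its REST API and flagging queue growth. The host, port, and alert threshold are hypothetical, and the endpoint path and response field follow the NiFi 1.x REST API as I understand it, so verify them against your deployment.

```python
# Minimal NiFi monitoring sketch: poll the flow status endpoint and warn
# when the global FlowFile queue grows. Host and threshold are hypothetical;
# endpoint path follows the NiFi 1.x REST API (verify for your version).
import time
import requests

NIFI = "http://nifi-host:8080/nifi-api"   # hypothetical, unsecured instance

def queued_count() -> int:
    resp = requests.get(f"{NIFI}/flow/status", timeout=10)
    resp.raise_for_status()
    return int(resp.json()["controllerStatus"]["flowFilesQueued"])

while True:
    queued = queued_count()
    if queued > 10_000:                   # hypothetical alert threshold
        print(f"WARN: {queued} FlowFiles queued; investigate backpressure")
    time.sleep(60)
```

A secured cluster would additionally need TLS and a bearer token or client certificate on each request; the sketch omits that for brevity.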
Posted 2 months ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description and Requirements
Join our team and what we'll accomplish together
The Wireless Core Network Development team is responsible for end-to-end network architecture, development, and operations, including service orchestration and automation. The team designs, develops, maintains, and supports our Core Wireless Network and all its services specific to our customer data. We work as a team to introduce the latest technology and software to enable network orchestration and automation in the fast-evolving 5G ecosystem, and propel TELUS' digital transformation. Our team creates previously impossible solutions by leveraging a new approach to enable our customers' unique and rich wireless experiences. These innovative solutions will improve the quality of life of thousands while revolutionizing how everything and everyone connects. You will own the customer experience by providing strategy, managing change, and leveraging best-in-class security and AI to deliver reliable products to our customers. This will represent a fundamental change in how the telecom industry works, opening the possibility of making private cellular networks globally available, sparking innovation, and enabling access to the digital world for more people by providing previously unseen reliability at reduced costs.
What you'll do
Take overall responsibility for the architecture, design, and operational support of TELUS subscriber database solutions (HLR, HSS, EIR, IMEIDB, UDM, UDR). This includes, but is not limited to, understanding fully how the current network is architected and identifying the areas of improvement/modernization that we need to undertake, driving reliability and efficiency in the support of the solution
Help us design, develop, and implement software solutions supporting the subscriber data platforms within the 5G core architecture. This will include management, assurance, and closed-loop automation of the UDM, AUSF, and SDL, which will reside on cloud-native services
Bring your ideas, bring your coding skills, and bring your passion to learn
Identify end-to-end network control signaling and roaming gaps across available and ongoing designs, while architecting future-friendly solutions as technology evolves
Collaborate with cross-functional teams from the Radio, Core, Transport, Infrastructure, Business, and assurance domains, and define migration strategies for moving services to the cloud
Bring your experience in Open API, security, configuration, and data model management and processing with Node.js, and learn or bring experience with related technologies such as RESTful APIs, JSON, NETCONF, Apache NiFi, Kafka, SNMP, Java, Bash, HTTPS, SSH, TypeScript, and Python
Maintain and develop network architecture/design documents
Additional Job Description
What you bring:
5+ years of telecommunication experience
Experience in adapter API design using RESTful and NETCONF interfaces, with an interest in developing back-end software
Proven knowledge of technologies such as Service-Based Architecture (SBA), Subscriber Data Management functions, HTTP/2, Diameter, SIGTRAN, SS7, and 5G protocols
General understanding of TCP/IP networking and familiarity with TCP, UDP, SS7, RADIUS, and Diameter protocols, along with SOAP/REST API working principles
Proven understanding of IPsec and TLS 1.2/1.3, and understanding of the OAuth 2.0 framework
2+ years' experience as a software developer, advanced technical and analytical skills, and the ability to take responsibility for the overall technical direction of the project
Experience with public cloud-native services such as OpenShift, AWS, GCP, or Azure
Expert knowledge of database redundancy, replication, and synchronization
Knowledge of different database concepts (relational vs. non-relational databases)
Subject-matter expertise in implementing, integrating, and deploying solutions related to subscriber data management (HLR, HSS, EIR, IMEIDB, UDM, UDR, F5, Provisioning GW, AAA) on either private cloud or public cloud such as AWS, OCP, or GCP
Expert knowledge of the software project lifecycle and CI/CD pipelines
A Bachelor's degree in Computer Science, Computer Engineering, Electrical Engineering, or a STEM-related field, or relevant experience
Great-to-haves:
Understanding of 3GPP architectures and reference points for 4G and 5G wireless networks
Knowledge of the 3GPP, TMF, GSMA, and IETF standards bodies
Experience with Radio, Core, Transport, and Infrastructure product design, development, integration, test, and operations, including low-level protocol implementation on top of UDP, SCTP, GTPv1, and GTPv2
Experience with MariaDB, Cassandra, MongoDB, and data model management
AWS Fargate, Lambda, DynamoDB, SQS, Step Functions, CloudWatch, CloudFormation, and/or the AWS Cloud Development Kit
Knowledge of Python and API development in production environments
Experience with containerization tools such as Docker, Kubernetes, and/or OpenStack technology
Soft Skills:
Strong analytical and problem-solving abilities
Excellent communication skills, both written and verbal
Ability to work effectively in a team environment
Self-motivated with a proactive approach to learning new technologies
Capable of working under pressure and managing multiple priorities
EEO Statement
At TELUS Digital, we enable customer experience innovation through spirited teamwork, agile thinking, and a caring culture that puts customers first. TELUS Digital is the global arm of TELUS Corporation, one of the largest telecommunications service providers in Canada. We deliver contact center and business process outsourcing (BPO) solutions to some of the world's largest corporations in the consumer electronics, finance, telecommunications, and utilities sectors. With global call center delivery capabilities, our multi-shore, multi-language programs offer safe, secure infrastructure, value-based pricing, skills-based resources, and exceptional customer service, all backed by TELUS, our multi-billion-dollar telecommunications parent.
Equal Opportunity Employer
At TELUS Digital, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merits, competence, and performance without regard to any characteristic related to diversity.
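As a small illustration of the event-driven subscriber-data work described above, here is a minimal Kafka consumer sketch using the kafka-python package. The broker address, topic name, and event fields are hypothetical; the downstream action is only indicated by a print.

```python
# Minimal event-consumer sketch: read subscriber-data change events from
# Kafka and act on them. Uses the kafka-python package; broker, topic,
# and event fields are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "subscriber-profile-updates",              # hypothetical topic
    bootstrap_servers=["broker1:9092"],
    group_id="udm-sync",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real assurance flow this would push the change toward a
    # provisioning gateway or an audit store.
    print(f"subscriber={event.get('imsi')} change={event.get('operation')}")
```

Using a consumer group (group_id) is what lets several instances of such a worker share the topic's partitions, which matters for the redundancy and synchronization concerns the posting lists.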
Posted 2 months ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role: .NET Solution Architect
Domain: Insurance
Stack: Microsoft technologies, Azure, full stack, etc. (read below)
What we look for: Strong architecture skills, conceptual clarity, problem-solving ability, leadership qualities, client-facing experience, and depth in design and integration architecture.
Responsibilities: For a change, we are spelling out what qualifies a profile and the depth of architecture required. Candidates should go through the requirements below and share a resume with a separate page addressing the questions asked.
1. Scaled Architecture Design
Evaluation will be on the candidate's ability to design systems at scale (e.g., NPCI UPI, Uber, ONDC, RedBus):
How well do you decompose monolithic systems into scalable components (e.g., ONDC architecture, NPCI as a central switch, geospatial design for Uber)?
Do you address critical non-functional requirements (NFRs) like latency (<1s for UPI), throughput (10k+ TPS), and fault tolerance?
Patterns & Frameworks: Usage of event-driven architecture (Kafka), CQRS, the Saga pattern (for distributed transactions), and the Circuit Breaker pattern (resilience).
Challenges: We assess your approach to interoperability (UPI's PSP integration), real-time data sync (Uber's driver tracking), and overbooking prevention (RedBus).
2. Architecture Patterns vs. Styles
How do you justify pattern choices (e.g., Bulkhead for RedBus vs. Service Mesh for Uber)?
3. Standards & Governance
Compliance: Familiarity with TOGAF, ISO/IEC 42010, or the Zachman framework.
API Governance: Experience with tools like Apigee for centralized API management.
4. Self-Healing Systems
Use of Kubernetes (auto-scaling, pod recovery), Chaos Engineering (Chaos Monkey), and observability stacks (Prometheus/Grafana).
How do you implement retry mechanisms, circuit breakers, and automated rollbacks? (See the sketch after this listing.)
5. D2C Platform Expertise
Proficiency in headless commerce (e.g., Contentful), Customer Data Platforms (e.g., Segment), and API-first design.
6. Integration Architecture
Experience with protocol transformation (REST to SOAP), message queuing (RabbitMQ/Kafka), and ETL pipelines (Apache NiFi).
7. Security Architecture
Beyond SSO/auth: How do you implement zero-trust models, WAFs (Cloudflare), and SIEM tools (Splunk)?
Do you address OWASP Top 10 risks (e.g., SQLi, XSS) and data encryption (AES-256, TLS 1.3)?
8. Platform KPIs & Metrics
Focus on uptime (99.99% SLA), MTTR (<1 hr), TPS, and cost optimization.
How do you align KPIs with business goals (e.g., user growth vs. infrastructure scaling)?
9. Edge Computing - Edge-Based Processing
Use of AWS Greengrass or Kubernetes at the edge for local data processing.
How do you address latency reduction, offline capabilities, and edge-node security?
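Item 4 above asks candidates how they would implement retry mechanisms and circuit breakers. As a minimal, language-agnostic sketch of the circuit-breaker pattern (shown here in Python, with hypothetical thresholds, not the posting's .NET stack):

```python
# Minimal circuit-breaker sketch. After `max_failures` consecutive errors
# the breaker opens and fails fast; once `reset_after` seconds elapse it
# half-opens and allows a single probe call. Thresholds are hypothetical.
import time

class CircuitBreaker:
    def __init__(self, max_failures: int = 3, reset_after: float = 30.0):
        self.max_failures = max_failures
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.time() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None          # half-open: allow one probe call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.time()
            raise
        self.failures = 0                  # success closes the breaker
        return result
```

In a .NET context the same behaviour is usually obtained from a resilience library rather than hand-rolled; the point of the sketch is the state machine (closed, open, half-open), not the implementation language.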
Posted 2 months ago
0 years
0 Lacs
Agra, Uttar Pradesh, India
On-site
Major Accountabilities
Collaborate with the CIO on the application architecture and design of our ETL (Extract, Transform, Load) and other aspects of data pipelines. Our stack is built on top of the well-known Spark ecosystem (e.g., Scala, Python, etc.)
Periodically evaluate the architectural landscape for efficiencies in our data pipelines, and define the current-state and target-state architecture along with the transition plans and road maps needed to achieve the desired architectural state
Conduct, lead, and implement proofs of concept to prove new technologies in support of the architecture vision and guiding principles (e.g., Flink)
Assist in the ideation and execution of architectural principles, guidelines, and technology standards that can be leveraged across the team and organization, especially around ETL and data pipelines
Promote consistency between all applications leveraging enterprise automation capabilities
Provide architectural consultation, support, mentoring, and guidance to project teams (e.g., architects, data scientists, developers, etc.)
Collaborate with the DevOps Lead on technical features
Define and manage work items using Agile methodologies (Kanban, Azure Boards, etc.)
Lead data engineering efforts (e.g., Scala Spark, PySpark, etc.)
Knowledge & Experience
Experienced with Spark, Delta Lake, and Scala, working with petabytes of data across batch and streaming flows (see the sketch after this posting)
Knowledge of a wide variety of open-source technologies including, but not limited to: NiFi, Kubernetes, Docker, Hive, Oozie, YARN, ZooKeeper, PostgreSQL, RabbitMQ, Elasticsearch
A strong understanding of AWS/Azure and/or technology as a service (IaaS, SaaS, PaaS)
Strong verbal and written communication skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams
Appreciation of building high-volume, low-latency systems for the API flow
Core dev skills (SOLID principles, IoC, 12-factor apps, CI/CD, Git)
Messaging, microservice architecture, caching (Redis), containerization, performance and load testing, REST APIs
Knowledge of HTML, JavaScript frameworks (preferably Angular 2+), and TypeScript
Appreciation of Python and C# .NET Core or Java
Appreciation of global data privacy requirements and cryptography
Experience in system testing and in automated testing, e.g., unit tests, integration tests, mocking/stubbing
Relevant Industry And Other Professional Qualifications
Tertiary qualifications (degree level)
We are an inclusive employer and welcome applicants from all backgrounds. We pride ourselves on our commitment to Equality and Diversity and are committed to removing barriers throughout our hiring process.
Key Requirements
Extensive data engineering development experience (e.g., ETL), using well-known stacks (e.g., Scala Spark)
Experience in technical leadership positions (or looking to gain it)
Background in software engineering
The ability to write technical documentation
Solid understanding of virtualization and/or cloud computing technologies (e.g., Docker, Kubernetes)
Experience in designing software solutions; enjoys UML and the odd sequence diagram
Experience operating within an Agile environment
Ability to work independently and with minimum supervision
Strong project development management skills, with the ability to successfully manage and prioritize numerous time-pressured analytical projects/work tasks simultaneously
Able to pivot quickly and make rapid decisions based on changing needs in a fast-paced environment
Works constructively with teams and acts with high integrity
Passionate team player with an inquisitive, creative mindset and the ability to think outside the box
Skills: Java, Scala, Apache Spark, Hadoop, and ETL
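Given the emphasis on Spark and Delta Lake above, here is a minimal sketch of appending to a Delta table and reading an earlier version back. It assumes the delta-spark package is available to the Spark session; the paths are hypothetical.

```python
# Minimal Delta Lake sketch: append a batch to a Delta table, then use
# time travel to read an earlier version. Assumes the delta-spark package
# is on the classpath; paths are hypothetical.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("delta-append-demo")
         .config("spark.sql.extensions",
                 "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

updates = spark.read.parquet("hdfs:///data/incoming/accounts/")
updates.write.format("delta").mode("append").save("hdfs:///delta/accounts/")

# Time travel: read the table as of an earlier version, e.g. for audits
# or rollbacks.
v0 = (spark.read.format("delta")
      .option("versionAsOf", 0)
      .load("hdfs:///delta/accounts/"))
v0.show()

spark.stop()
```

Versioned tables like this are one of the reasons Delta Lake is attractive for the batch-plus-streaming flows the posting describes: the same table can serve both, with ACID guarantees on concurrent writes.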
Posted 2 months ago