10.0 years
0 Lacs
Sahibzada Ajit Singh Nagar, Punjab, India
On-site
Job Description Job Title: Chief AI Officer (CAIO) Location: Mohali Reports To: CEO Exp: 10+ Years About RChilli RChilli is a global leader in HR Tech, delivering AI-driven solutions for resume parsing, data enrichment, and talent acquisition. We are looking for a visionary Chief AI Officer (CAIO) to drive AI strategy, innovation, and ethical AI deployment in HRTech. Key Responsibilities AI Strategy & Leadership Develop and execute RChilli’s AI strategy aligned with business goals. Ensure ethical AI implementation and compliance with industry regulations. Be a change leader in adopting AI across the company. AI-Driven Product Innovation Lead AI research & development for NLP, machine learning, and predictive analytics. Implement AI for automated job descriptions, resume scoring, and candidate recommendations. Oversee AI-powered chatbots, workforce planning, and predictive retention models. Identify opportunities for AI implementation, including: Automated calls for candidate screening, interview scheduling, and feedback collection. AI-powered report generation for HR analytics, performance tracking, and compliance. AI-based note-taking and meeting summarization for enhanced productivity. Technology & Infrastructure Define and implement a scalable AI roadmap. Manage AI infrastructure, data lakes, ETL processes, and automation. Oversee data lakes and ETL tools such as Airflow and NiFi for efficient data management. Ensure robust data engineering and analysis frameworks. Generative AI, Conversational AI & Transformative AI Apply Generative AI for automating job descriptions, resume parsing, and intelligent recommendations. Leverage Conversational AI for chatbots, virtual assistants, and AI-driven HR queries. Utilize Transformative AI for workforce planning, sentiment analysis, and predictive retention models. Tool Identification & Implementation Identify business requirements and assess third-party AI tools available in the market. Implement and integrate AI tools to enhance operations and optimize business processes. Business Integration & Operations Collaborate with cross-functional teams to integrate AI into HRTech solutions. Understand and optimize business processes for AI adoption. Align AI-driven processes with business efficiency and customer needs. Leadership & Talent Development Build and mentor an AI team, fostering a culture of innovation. Promote AI literacy across the organization. Industry Thought Leadership Represent RChilli in AI forums, conferences, and industry partnerships. Stay ahead of AI trends and HRTech advancements. Technical Skills Required Skills & Qualifications: Master’s/Ph.D. in Computer Science, AI, Data Science, or related field. 10+ years of experience in AI/ML, with 5+ years in leadership roles. Expertise in NLP, machine learning, deep learning, and predictive analytics. Experience in AI ethics, governance, and compliance frameworks. Strong proficiency in AI infrastructure, data engineering, and automation tools. Understanding of data lakes, ETL processes, Airflow, and NiFi tools. Clear concepts in data engineering and analysis. Leadership & Business Skills Strategic thinker with the ability to align AI innovation with business goals. Excellent communication and stakeholder management skills. Experience in building and leading AI teams. Why Join RChilli? Lead AI Innovation: Shape AI-driven HR solutions in a globally recognized HRTech company. Impactful Work: Drive AI transformations in HR operations and talent acquisition. 
Growth & Learning: Work with a passionate AI research and product team. Competitive Package: Enjoy a competitive salary, benefits, and career growth opportunities. If you are a visionary AI leader ready to transform HRTech, join RChilli as our Chief AI Officer.
Posted 2 weeks ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Requirements Description and Requirements Position Summary: A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued. Job Responsibilities: Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters. Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency. Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features. Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos. Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity. Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency. Analyze logs and use tools like Splunk to debug and resolve production issues. Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency. Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management. Education: Bachelor’s degree in Computer Science, Information Systems, or another related field with 7+ years of IT and Infrastructure engineering work experience. Experience: 7+ years total IT experience & 4+ years relevant experience in Big Data databases. Technical Skills: Big Data Platform Management: Knowledge of managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL. Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency. DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices. Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues. Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning. Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity. Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos. Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments. ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management.
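For illustration, a minimal Python sketch of the kind of operational script this role calls for (routine monitoring plus automation): it shells out to the standard hdfs dfsadmin -report command and flags dead DataNodes. The report wording, host setup, and the alerting hook (a non-zero exit code for cron or a monitoring agent) are assumptions, not part of the posting.

```python
#!/usr/bin/env python3
"""Illustrative HDFS health check of the kind this role automates.
Assumes the 'hdfs' CLI is on PATH and that the report wording matches
recent Hadoop releases; adjust the regex for your distribution."""
import re
import subprocess
import sys

def hdfs_report() -> str:
    # 'hdfs dfsadmin -report' prints capacity and DataNode status.
    return subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout

def main() -> int:
    report = hdfs_report()
    dead = re.search(r"Dead datanodes\s*\((\d+)\)", report)
    dead_count = int(dead.group(1)) if dead else 0
    if dead_count > 0:
        print(f"ALERT: {dead_count} dead DataNode(s) detected", file=sys.stderr)
        return 1  # non-zero exit lets cron or a monitoring agent raise an alert
    print("HDFS report OK: no dead DataNodes")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```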
About MetLife Recognized on Fortune magazine's list of the 2024 “World's Most Admired Companies” and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 2 weeks ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS!!! TCS has been a great pioneer in feeding the fire of young Techies like you. We are a global leader in the technology arena and there’s nothing that can stop us from growing together. Your role is of key importance, as it lays down the foundation for the entire project. Position: QA-Big Data Job Location: Bangalore / Pune Experience: 8+ Interview Mode: Virtual (MS Teams) JD: Job Title: Manual/Automation QA Tester - Spark, Scala, Apache NiFi, Java Key Responsibilities: Develop, execute, and maintain automated and manual test scripts for data processing pipelines built using Apache Spark and Apache NiFi. Perform end-to-end testing of ETL processes, including data extraction, transformation, and loading. Design and implement test plans, test cases, and test data for functional, regression, and performance testing. Collaborate with developers, data engineers, and product managers to understand requirements, identify test scenarios, and ensure test coverage. Analyze test results, identify defects, and work closely with the development team to troubleshoot and resolve issues. Monitor, report, and track the quality of data processing and ETL jobs in a big data environment. Preferred Qualifications: Experience with Big Data technologies such as Spark. Knowledge of other programming languages like Python or Java. Experience with performance testing tools and techniques for data processing workloads. Familiarity with cloud platforms such as AWS, Azure, or Google Cloud. Work Location India: Bangalore / Pune TCS Eligibility Criteria: *BE/B.tech/MCA/M.Sc./MS with minimum 3 years of relevant IT experience post qualification. *Only Full-Time courses would be considered. *Candidates who have attended a TCS interview within 1 month need not apply. Referrals are always welcome!!! Thanks & Regards Jerin L Varghese
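As a rough sketch of the automated test scripts described above, here is a pytest-style check of a Spark transformation run against a small in-memory dataset. The transformation under test (normalize_emails) is hypothetical; the example only assumes pyspark and pytest are installed.

```python
# Minimal pytest-style sketch: validate a Spark transformation on tiny test data.
import pytest
from pyspark.sql import SparkSession, functions as F

def normalize_emails(df):
    """Example transformation: trim and lower-case emails, drop duplicates."""
    return (df.withColumn("email", F.lower(F.trim(F.col("email"))))
              .dropDuplicates(["email"]))

@pytest.fixture(scope="module")
def spark():
    session = SparkSession.builder.master("local[1]").appName("qa-demo").getOrCreate()
    yield session
    session.stop()

def test_normalize_emails_dedupes_and_lowercases(spark):
    source = spark.createDataFrame(
        [(" Alice@Example.COM ",), ("alice@example.com",), ("bob@example.com",)],
        ["email"],
    )
    result = normalize_emails(source)
    emails = sorted(row["email"] for row in result.collect())
    assert emails == ["alice@example.com", "bob@example.com"]
```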
Posted 2 weeks ago
7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Requirements Description and Requirements Position Summary: A Big Data (Hadoop) Administrator responsible for supporting the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Strong expertise in DevOps practices, automation, and scripting (e.g., Ansible, Azure DevOps, Shell, Python) to streamline operations and improve efficiency is highly valued. Job Responsibilities: Assist in the installation, configuration, and maintenance of Cloudera Data Platform (CDP) and Cloudera Flow Management (CFM) streaming clusters on RedHat Linux. Perform routine monitoring, troubleshooting, and issue resolution to ensure the stability and performance of Hadoop clusters. Develop and maintain scripts (e.g., Python, Bash, Ansible) to automate operational tasks and improve system efficiency. Collaborate with cross-functional teams, including application development, infrastructure, and operations, to support business requirements and implement new features. Implement and follow best practices for cluster security, including user access management and integration with tools like Apache Ranger and Kerberos. Support backup, recovery, and disaster recovery processes to ensure data availability and business continuity. Conduct performance tuning and optimization of Hadoop clusters to enhance system efficiency and reduce latency. Analyze logs and use tools like Splunk to debug and resolve production issues. Document operational processes, maintenance procedures, and troubleshooting steps to ensure knowledge sharing and consistency. Stay updated on emerging technologies and contribute to the adoption of new tools and practices to improve cluster management. Education: Bachelor’s degree in Computer Science, Information Systems, or another related field with 7+ years of IT and Infrastructure engineering work experience. Experience: 7+ years total IT experience & 4+ years relevant experience in Big Data databases. Big Data Platform Management: Knowledge of managing and optimizing the Cloudera Data Platform, including components such as Apache Hadoop (YARN and HDFS), Apache HBase, Apache Solr, Apache Hive, Apache Kafka, Apache NiFi, Apache Ranger, Apache Spark, as well as JanusGraph and IBM BigSQL. Automation and Scripting: Expertise in automation tools and scripting languages such as Ansible, Python, and Bash to streamline operational tasks and improve efficiency. DevOps Practices: Proficiency in DevOps tools and methodologies, including CI/CD pipelines, version control systems (e.g., Git), and infrastructure-as-code practices. Monitoring and Troubleshooting: Experience with monitoring and observability tools such as Splunk, Elastic Stack, or Prometheus to identify and resolve system issues. Linux Administration: Solid knowledge of Linux operating systems, including system administration, troubleshooting, and performance tuning. Backup and Recovery: Familiarity with implementing and managing backup and recovery processes to ensure data availability and business continuity. Security and Access Management: Understanding of security best practices, including user access management and integration with tools like Kerberos. Agile Methodologies: Knowledge of Agile practices and frameworks, such as SAFe, with experience working in Agile environments. ITSM Tools: Familiarity with ITSM processes and tools like ServiceNow for incident and change management.
About MetLife Recognized on Fortune magazine's list of the 2024 “World's Most Admired Companies” and Fortune World’s 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple - to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!
Posted 2 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About the Role: As a Big Data Engineer, you will play a critical role in integrating multiple data sources, designing scalable data workflows, and collaborating with data architects, scientists, and analysts to develop innovative solutions. You will work with rapidly evolving technologies to achieve strategic business goals. Must-Have Skills: 4+ years of mandatory experience with Big Data. 4+ years of mandatory experience in Apache Spark. Proficiency in Apache Spark, Hive on Tez, and Hadoop ecosystem components. Strong coding skills in Python & PySpark. Experience building reusable components or frameworks using Spark. Expertise in data ingestion from multiple sources using APIs, HDFS, and NiFi. Solid experience working with structured, unstructured, and semi-structured data formats (Text, JSON, Avro, Parquet, ORC, etc.). Experience with UNIX Bash scripting and databases like Postgres, MySQL and Oracle. Ability to design, develop, and evolve fault-tolerant distributed systems. Strong SQL skills, with expertise in Hive, Impala, Mongo and NoSQL databases. Hands-on with Git and CI/CD tools. Experience with streaming data technologies (Kafka, Spark Streaming, Apache Flink, etc.). Proficient with HDFS or similar data lake technologies. Excellent problem-solving skills; you will be evaluated through coding rounds. Key Responsibilities: Must be capable of handling commissioning and decommissioning of name nodes, data nodes, and edge nodes in existing or new Apache HDFS clusters. Work closely with data architects and analysts to design technical solutions. Integrate and ingest data from multiple source systems into big data environments. Develop end-to-end data transformations and workflows, ensuring logging and recovery mechanisms. Must be able to troubleshoot Spark job failures. Design and implement batch, real-time, and near-real-time data pipelines. Optimize Big Data transformations using Apache Spark, Hive, and Tez. Work with Data Science teams to enhance actionable insights. Ensure seamless data integration and transformation across multiple systems.
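A hedged PySpark sketch of the ingestion work this listing describes: reading semi-structured JSON from an HDFS landing zone (for example, dropped there by NiFi) and writing partitioned Parquet for downstream Hive queries. The paths, column names, and partition key are illustrative assumptions.

```python
# Minimal PySpark sketch: HDFS landing zone (JSON) -> curated partitioned Parquet.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("landing-to-curated")
         .enableHiveSupport()          # assumes a Hive metastore is configured
         .getOrCreate())

landing_path = "hdfs:///data/landing/events/"   # hypothetical NiFi drop zone
curated_path = "hdfs:///data/curated/events/"   # hypothetical curated zone

events = (spark.read.json(landing_path)
          .withColumn("event_date", F.to_date("event_ts"))  # assumed timestamp field
          .dropDuplicates(["event_id"]))                     # assumed business key

(events.write
       .mode("append")
       .partitionBy("event_date")
       .parquet(curated_path))
```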
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Greetings from TCS!!! TCS has been a great pioneer in feeding the fire of young Techies like you. We are a global leader in the technology arena and there’s nothing that can stop us from growing together. Your role is of key importance, as it lays down the foundation for the entire project. Job Title: Big Data Developer Job Location: Pune Experience: 4-7 Must-have skillset: PySpark, Scala, NiFi, Hadoop High-level job description for LVS exit: Professional Big Data Hadoop development experience between 3-8 years is preferred. Expertise with Big Data ecosystem services, such as Spark (Scala/Python), Hive, Kafka, Unix, and experience with any cloud stack, preferably GCP (BigQuery & Dataproc). Experience in working with large cloud data lakes. Experience with large-scale data processing, complex event processing, stream processing. Experience in working with CI/CD pipelines, source code repositories, and operating environments. Experience in working with both structured and unstructured data, with a high degree of SQL knowledge. Experience designing and implementing scalable ETL/ELT processes and modeling data for low-latency reporting. Experience in performance tuning, troubleshooting and diagnostics, process monitoring, and profiling. Understanding of containerization, virtualization, and cloud computing. TCS Eligibility Criteria: *BE/B.tech/MCA/M.Sc./MS with minimum 3 years of relevant IT experience post qualification. *Only Full-Time courses would be considered. *Candidates who have attended a TCS interview within 1 month need not apply. Referrals are always welcome!!! Thanks & Regards Kavya T Talent Acquisition Associate Tata Consultancy Services
Posted 2 weeks ago
2.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Who Are We Fulcrum Digital is an agile and next-generation digital accelerating company providing digital transformation and technology services right from ideation to implementation. These services have applicability across a variety of industries, including banking & financial services, insurance, retail, higher education, food, health care, and manufacturing. The Role Plan, manage, and oversee all aspects of a Production Environment for Big Data Platforms. Define strategies for Application Performance Monitoring and Optimization in the Prod environment. Respond to incidents, improve the platform based on feedback, and measure the reduction of incidents over time. Ensure that batch production scheduling and processes are accurate and timely. Able to create and execute queries against the big data platform and relational data tables to identify process issues or to perform mass updates (preferred). Perform ad hoc requests from users such as data research, file manipulation/transfer, research of process issues, etc. Take a holistic approach to problem-solving, by connecting the dots during a production event through the various technology stack that makes up the platform, to optimize mean time to recover. Engage in and improve the whole lifecycle of services, from inception and design through deployment, operation, and refinement. Analyze ITSM activities of the platform and provide a feedback loop to development teams on operational gaps or resiliency concerns. Support services before they go live through activities such as system design consulting, capacity planning, and launch reviews. Support the application CI/CD pipeline for promoting software into higher environments through validation and operational gating, and lead in DevOps automation and best practices. Maintain services once they are live by measuring and monitoring availability, latency, and overall system health. Scale systems sustainably through mechanisms like automation and evolving systems by pushing for changes that improve reliability and velocity. Work with a global team spread across tech hubs in multiple geographies and time zones. Ability to share knowledge and explain processes and procedures to others. Requirements Experience in Linux and knowledge of ITSM/ITIL. Experience in Big Data technologies (Hadoop, Spark, NiFi, Impala). 2 years of experience in running Big Data production systems. Good to have experience in industry-standard CI/CD tools like Git/Bitbucket, Jenkins, Maven. Solid grasp of SQL or Oracle fundamentals. Experience with scripting, pipeline management, and software design. Systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive. Ability to help debug and optimize code and automate routine tasks. Ability to support many different stakeholders. Experience in dealing with difficult situations and making decisions with a sense of urgency is needed. Appetite for change and pushing the boundaries of what can be done with automation. Experience in working across development, operations, and product teams to prioritize needs and to build relationships is a must. Experience designing and implementing an effective and efficient CI/CD flow that gets code from dev to prod with high quality and minimal manual effort is desired. Good handle on change management and release management aspects of software. Locations - Pune, India
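To illustrate the ad hoc production checks this role mentions (queries against the big data platform to identify process issues), here is a small Spark SQL sketch that flags feeds that have not loaded for the latest business date. The ops.batch_control table and its columns are hypothetical.

```python
# Illustrative batch-health check against a hypothetical control table.
import datetime as dt
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("batch-health-check")
         .enableHiveSupport()
         .getOrCreate())

business_date = (dt.date.today() - dt.timedelta(days=1)).isoformat()

late_feeds = spark.sql(f"""
    SELECT feed_name, expected_time, load_status
    FROM ops.batch_control                -- hypothetical control table
    WHERE business_date = '{business_date}'
      AND load_status != 'SUCCESS'
""")

for row in late_feeds.collect():
    print(f"INCIDENT CANDIDATE: {row.feed_name} status={row.load_status}")
```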
Posted 2 weeks ago
5.0 - 8.0 years
15 - 18 Lacs
Coimbatore
Hybrid
Role & responsibilities Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights. Constructing infrastructure for efficient ETL processes from various sources and storage systems. Leading the implementation of algorithms and prototypes to transform raw data into useful information. Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations. Creating innovative data validation methods and data analysis tools. Ensuring compliance with data governance and security policies. Interpreting data trends and patterns to establish operational alerts. Developing analytical tools, programs, and reporting mechanisms. Conducting complex data analysis and presenting results effectively. Preparing data for prescriptive and predictive modeling. Continuously exploring opportunities to enhance data quality and reliability. Applying strong programming and problem-solving skills to develop scalable solutions.
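As a small illustration of the data validation methods mentioned above, the sketch below runs simple reusable checks (null rate, allowed values, numeric range) over a pandas DataFrame before data moves on to analytics or modeling. Thresholds and column names are assumptions.

```python
# Simple, reusable data-quality checks on a pandas DataFrame (illustrative only).
import pandas as pd

def validate(df: pd.DataFrame) -> list:
    issues = []
    if df["customer_id"].isna().mean() > 0.01:           # >1% missing keys
        issues.append("customer_id: too many nulls")
    if not df["country"].isin(["IN", "US", "GB"]).all():  # allowed-value check
        issues.append("country: unexpected codes present")
    if (df["amount"] < 0).any():                           # range check
        issues.append("amount: negative values found")
    return issues

sample = pd.DataFrame({
    "customer_id": [1, 2, None],
    "country": ["IN", "US", "FR"],
    "amount": [100.0, -5.0, 42.0],
})
print(validate(sample))   # flags the null, country, and amount problems
```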
Posted 2 weeks ago
12.0 - 18.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Role: Senior Manager Exp: 12+ yrs Budget: Max 30 LPA Location: Bangalore Immediate joiners only Graduation: BE, BTech, ME, MTech Exposure to Spring, NiFi, Kafka, Postgres, Elasticsearch, Java, Ansible, Angular, Node JS, Python, React, MongoDB, CI/CD DevOps.
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Software Development Engineer II Software Development Engineer (Data Engineering) Overview Mastercard is the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. Enterprise Data Solution (EDS) is focused on enabling insights into the Mastercard network and helping build data-driven products by curating and preparing data in a secure and reliable manner. Moving to a “Unified and Fault-Tolerant Architecture for Data Ingestion and Processing” is critical to achieving this mission. As a Software Development Engineer (Data Engineering) in Enterprise Data Solution (EDS), you will have the opportunity to build high-performance data pipelines to load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help different customers find answers to their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems. Role Participate in medium-to-large size data engineering projects Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test Assist business in utilizing data-driven insights to drive growth and transformation. Build and maintain data processing workflows feeding Mastercard analytics domains. Facilitate reliable integrations with internal systems and third-party APIs as needed. Support data analysts as needed, advising on data definitions and helping them derive meaning from complex datasets. Work with cross-functional Agile teams to drive projects through the full development cycle. Help the team improve with the usage of data engineering best practices. Collaborate with other data engineering teams to improve the data engineering ecosystem and talent within Mastercard.
All About You At least Bachelor's degree in Computer Science, Computer Engineering or Technology related field or equivalent work experience Experience in Data Warehouse related projects in a product or service based organization Expertise in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment Experience working with databases like Oracle and Netezza, with strong SQL knowledge Additional experience of building data pipelines through Spark with Scala/Python/Java on Hadoop is preferred Experience of working on NiFi will be an added advantage Experience of working in Agile teams Strong analytical skills required for debugging production issues, providing root cause and implementing mitigation plan Strong communication skills - both verbal and written - and strong relationship, collaboration, and organizational skills Ability to be high-energy, detail-oriented, proactive and able to function under pressure in an independent environment along with a high degree of initiative and self-motivation to drive results Ability to quickly learn and implement new technologies, and perform POC to explore the best solution for the problem statement Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams Corporate Security Responsibility All activities involving access to Mastercard assets, information, and networks comes with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-246732
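A hedged sketch of the warehouse-load step this role describes: a PySpark job that selects curated data from Hive and appends it to a warehouse staging table over JDBC. The JDBC URL, driver class, table names, and credential handling are placeholders to be replaced by the environment's own standards.

```python
# Illustrative warehouse load: Hive-curated data appended to a JDBC target.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("dw-load")
         .enableHiveSupport()
         .getOrCreate())

daily = spark.sql(
    "SELECT * FROM curated.transactions WHERE business_date = '2024-01-31'"
)  # hypothetical source table and date filter

(daily.write
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dw-host:1521/DWSVC")  # placeholder URL
      .option("dbtable", "EDW.TRANSACTIONS_STG")                # placeholder table
      .option("user", "etl_user")
      .option("password", "****")     # in practice, fetch from a secrets store
      .option("driver", "oracle.jdbc.OracleDriver")             # assumed driver class
      .mode("append")
      .save())
```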
Posted 2 weeks ago
1.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Velotio Technologies is a product engineering company working with innovative startups and enterprises. We are a certified Great Place to Work® and recognized as one of the best companies to work for in India. We have provided full-stack product development for 110+ startups across the globe building products in the cloud-native, data engineering, B2B SaaS, IoT & Machine Learning space. Our team of 400+ elite software engineers solves hard technical problems while transforming customer ideas into successful products. Requirements Design and build scalable data infrastructure with efficiency, reliability, and consistency to meet rapidly growing data needs Build the applications required for optimal extraction, cleaning, transformation, and loading data from disparate data sources and formats using the latest big data technologies Building ETL/ELT pipelines and work with other data infrastructure components, like Data Lakes, Data Warehouses and BI/reporting/analytics tools Work with various cloud services like AWS, GCP, Azure to implement highly available, horizontally scalable data processing and storage systems and automate manual processes and workflows Implement processes and systems to monitor data quality, to ensure data is always accurate, reliable, and available for the stakeholders and other business processes that depend on it Work closely with different business units and engineering teams to develop a long-term data platform architecture strategy and thus foster data-driven decision-making practices across the organization Help establish and maintain a high level of operational excellence in data engineering Evaluate, integrate, and build tools to accelerate Data Engineering, Data Science, Business Intelligence, Reporting, and Analytics as needed Focus on building test-driven development by writing unit/integration tests Contribute to design documents and engineering wiki You will enjoy this role if you... Like building elegant well-architected software products with enterprise customers Want to learn to leverage public cloud services & cutting-edge big data technologies, like Spark, Airflow, Hadoop, Snowflake, and Redshift Work collaboratively as part of a close-knit team of geeks, architects, and leads Desired Skills & Experience: 1+ years of data engineering or equivalent knowledge and ability 1+ years software engineering or equivalent knowledge and ability Strong proficiency in at least one of the following programming languages: Python, Scala, or Java Experience designing and maintaining at least one type of database (Object Store, Columnar, In-memory, Relational, Tabular, Key-Value Store, Triple-store, Tuple-store, Graph, and other related database types) Good understanding of star/snowflake schema designs Extensive experience working with big data technologies like Spark, Hadoop, Hive Experience building ETL/ELT pipelines and working on other data infrastructure components like BI/reporting/analytics tools Experience working with workflow orchestration tools like Apache Airflow, Oozie, Azkaban, NiFi, Airbyte, etc. Experience building production-grade data backup/restore strategies and disaster recovery solutions Hands-on experience with implementing batch and stream data processing applications using technologies like AWS DMS, Apache Flink, Apache Spark, AWS Kinesis, Kafka, etc. 
Knowledge of best practices in developing and deploying applications that are highly available and scalable Experience with or knowledge of Agile Software Development methodologies Excellent problem-solving and troubleshooting skills Process-oriented with excellent documentation skills Bonus points if you: Have hands-on experience using one or multiple cloud service providers like AWS, GCP, Azure and have worked with specific products like EMR, Glue, Dataproc, Databricks, Data Studio, etc. Have hands-on experience working with either Redshift, Snowflake, BigQuery, Azure Synapse, or Athena and understand the inner workings of these cloud storage systems Have experience building data lakes, scalable data warehouses, and data marts Have familiarity with tools like Jupyter Notebooks, Pandas, NumPy, SciPy, scikit-learn, Seaborn, SparkML, etc. Have experience building and deploying Machine Learning models to production at scale Possess excellent cross-functional collaboration and communication skills Our Culture: We have an autonomous and empowered work culture encouraging individuals to take ownership and grow quickly Flat hierarchy with fast decision making and a startup-oriented “get things done” culture A strong, fun & positive environment with regular celebrations of our success. We pride ourselves on creating an inclusive, diverse & authentic environment At Velotio, we embrace diversity. Inclusion is a priority for us, and we are eager to foster an environment where everyone feels valued. We welcome applications regardless of ethnicity or cultural background, age, gender, nationality, religion, disability or sexual orientation.
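For the workflow-orchestration experience this listing asks for, here is a minimal Apache Airflow DAG sketch: a daily extract, transform, load sequence wired as three Python tasks. Task bodies are stubs, and the dag_id, schedule, and function names are illustrative assumptions written against the Airflow 2.x API.

```python
# Minimal Airflow 2.x DAG: daily extract -> transform -> load (stub tasks).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull from source API / object store")

def transform(**_):
    print("clean and reshape the extracted data")

def load(**_):
    print("write to the warehouse / data lake")

with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```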
Posted 2 weeks ago
12.0 - 18.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description Observability Engineer Experience: 12 to 18 Years We are looking for an experienced Observability Engineer to architect, implement, and maintain enterprise-grade monitoring solutions. This role demands deep expertise across observability tools, infrastructure monitoring, log analytics, and cloud-native environments. Key Responsibilities Architect and manage observability frameworks using LogicMonitor, ServiceNow, BigPanda, and NiFi Implement log analytics and security monitoring with Azure Log Analytics and Azure Sentinel Build real-time dashboards using KQL, Splunk, and Grafana suite (Alloy, Beyla, K6, Loki, Thanos, Tempo) Lead infrastructure observability strategy for AKS (Azure Kubernetes Service) Automate observability workflows with PowerShell, GitHub, and API Management Collaborate with DevOps, cloud, and platform teams to ensure end-to-end system visibility and performance Core Skills Required Monitoring & Alerting: LogicMonitor, BigPanda, ServiceNow, NiFi Log Analytics & SIEM: Azure Log Analytics, Azure Sentinel, KQL Dashboards & Visualization: Grafana suite, Splunk Cloud & Containers: AKS, Data Pipelines Automation & DevOps: GitHub, PowerShell, API Management Preferred Skills Working knowledge of Cribl for log routing and filtering Familiarity with distributed tracing, advanced metrics, and telemetry strategies Soft Skills & Expectations Strong cross-functional collaboration and stakeholder communication Ability to drive initiatives independently and mentor junior engineers Eagerness to explore and adopt emerging observability tools and trends Skills: BigPanda, Azure Automation, Azure
Posted 2 weeks ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Come work at a place where innovation and teamwork come together to support the most exciting missions in the world! Director of Engineering, Connectors and Partner Integrations About The Role We are seeking an experienced and strategic Director of Engineering for Connectors and Platform Integrations to lead and scale our efforts in building high-impact integrations across cloud platforms, third-party applications (on-premises and cloud), security tools, and partner ecosystems. This role is crucial in enhancing the interoperability of the Qualys Enterprise TruRisk Platform with the broader security and IT operations ecosystem. You will lead multiple engineering teams focused on developing scalable connectors, APIs, SDKs, and integration solutions that empower our customers to extract maximum value from the Qualys Enterprise TruRisk Platform. The successful candidate will have a proven track record in building and managing high impact connectors in cybersecurity. Key Responsibilities Lead engineering efforts for developing and maintaining connectors and integrations with third-party platforms, including cloud providers (AWS, Azure, GCP), security tools, ITSM systems, and other enterprise applications. Build and foster strong technical partnerships with vendors, technology partners, and integration collaborators to expand the Qualys Enterprise TruRisk Platform ecosystem. Collaborate with Cross-Functional Engineering Teams, Product Management, Solution Architects, and Sales Engineering teams to define integration strategies and prioritize development based on customer needs and strategic initiatives. Oversee the architecture and delivery of integration components to ensure they meet performance, scalability, and security requirements. Manage, mentor, and scale high-performing engineering teams, focusing on execution, innovation, and excellence. Own the roadmap and execution for integration-related initiatives, ensuring on-time delivery and alignment with business goals. Act as a senior technical leader, driving engineering best practices and fostering a culture of continuous improvement and collaboration. Represent Qualys in partner-facing engagements, technical workshops, and integration strategy meetings. Qualifications 12+ years of experience in software engineering, with at least 5+ years in a senior leadership role. Proven track record in building and delivering enterprise-scale platform integrations and connectors for technologies such as SIEM, SOAR, CMDB, Ticketing Systems and ITSM integrations Strong experience working with cloud providers (AWS, Azure, GCP), RESTful APIs, webhooks, message brokers, and modern integration frameworks (Apache Camel, Apache NiFi). Knowledge of API gateways, authentication protocols (OAuth, SAML), and integration security best practices. Familiarity with data normalization, transformation, and sync mechanisms with Connector development Deep understanding of partner ecosystem management, including collaboration, co-development, and joint delivery. Exposure to working with partner certification programs Excellent stakeholder management and communication skills; able to bridge technical and business conversations across internal and external teams. Demonstrated ability to lead cross-functional initiatives and manage engineering teams in a distributed and agile environment. Expertise with programming languages such as Java Bachelor’s or Master’s degree in Computer Science, Engineering, or related field.
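As a rough illustration of the connector pattern this role covers, the sketch below obtains an OAuth 2.0 client-credentials token from a partner platform and pages through a findings API with the requests library. All URLs, parameter names, and response fields are placeholders; a real connector would follow each partner's documented API.

```python
# Illustrative connector pattern: OAuth 2.0 client-credentials token + paged API poll.
import requests

TOKEN_URL = "https://partner.example.com/oauth/token"       # placeholder
FINDINGS_URL = "https://partner.example.com/api/findings"   # placeholder

def get_token(client_id: str, client_secret: str) -> str:
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_findings(token: str):
    page = 1
    while True:
        resp = requests.get(FINDINGS_URL,
                            headers={"Authorization": f"Bearer {token}"},
                            params={"page": page}, timeout=30)
        resp.raise_for_status()
        items = resp.json().get("items", [])
        if not items:
            break
        yield from items
        page += 1

if __name__ == "__main__":
    token = get_token("my-client-id", "my-client-secret")
    for finding in fetch_findings(token):
        print(finding)
```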
Posted 2 weeks ago
0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
What is your Role? You will work in a multi-functional role with a combination of expertise in System and Hadoop administration. You will work in a team that often interacts with customers on various aspects related to technical support for deployed systems. You will be deputed at customer premises to assist customers with issues related to System and Hadoop administration. You will interact with the QA and Engineering teams to coordinate issue resolution within the SLA promised to the customer. What will you do? Deploying and administering the Hortonworks, Cloudera, Apache Hadoop/Spark ecosystem. Installing Linux Operating System and Networking. Writing Unix shell/Ansible scripts for automation. Maintaining core components such as ZooKeeper, Kafka, NiFi, HDFS, YARN, Redis, Spark, HBase, etc. Taking care of the day-to-day running of Hadoop clusters using Ambari/Cloudera Manager/other monitoring tools, ensuring that the Hadoop cluster is up and running all the time. Maintaining HBase clusters and capacity planning. Maintaining Solr clusters and capacity planning. Working closely with the database team, network team and application teams to make sure that all the big data applications are highly available and performing as expected. Managing the KVM virtualization environment. What skills you should have? Technical Domain: Linux administration, Hadoop Infrastructure and Administration, Solr, Configuration Management (Ansible, etc.).
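A small illustration of the day-to-day health scripting such a role involves: probing ZooKeeper with its four-letter-word command ruok and expecting imok. The hostnames are placeholders, and recent ZooKeeper releases require the command to be whitelisted via 4lw.commands.whitelist.

```python
# Illustrative ZooKeeper liveness probe using the "ruok" four-letter-word command.
import socket

ZK_HOSTS = [("zk1.example.internal", 2181), ("zk2.example.internal", 2181)]  # placeholders

def zk_is_ok(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.sendall(b"ruok")
            return sock.recv(16).startswith(b"imok")
    except OSError:
        return False

for host, port in ZK_HOSTS:
    status = "OK" if zk_is_ok(host, port) else "NOT RESPONDING"
    print(f"{host}:{port} -> {status}")
```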
Posted 2 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Role: Lead/Senior Big Data Engineer Type: Full-time Location: Pune Experience: 8+ years (must) Notice Period: Immediate joiners only To be successful in this role, you should: Collaborate closely with Product Management and Engineering leadership to devise and build the right solution. Participate in design discussions and brainstorming sessions to select, integrate, and maintain Big Data tools and frameworks required to solve Big Data problems at scale. Design and implement systems to cleanse, process, and analyze large data sets using distributed processing tools like Akka and Spark. Understand and critically review existing data pipelines, and come up with ideas in collaboration with Technical Leaders and Architects to improve upon current bottlenecks. Take initiative, show the drive to pick up new work proactively, and work as a senior individual contributor on the multiple products and features we have. 7+ years of experience in developing highly scalable Big Data pipelines. In-depth understanding of the Big Data ecosystem including processing frameworks like Spark, Akka, Storm, and Hadoop, and the file types they deal with. Experience with ETL and data pipeline tools like Apache NiFi, Airflow, etc. Excellent coding skills in Java or Scala, including the understanding to apply appropriate design patterns when required. Experience with Git and build tools like Gradle/Maven/SBT. Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization. Write elegant, readable, maintainable, and extensible code. You are someone who would easily be able to: Work closely with the US and India engineering teams to help build the Java/Scala-based data pipelines. Lead the India engineering team in technical excellence and ownership of critical modules; own the development of new modules and features. Troubleshoot live production server issues. Handle client coordination, work as part of a team, contribute independently, and drive the team to exceptional contributions with minimal supervision. Follow Agile methodology and use JIRA for work planning and issue management/tracking. Additional Project/Soft Skills: Should be able to work independently with India & US-based team members. Strong verbal and written communication with the ability to articulate problems and solutions over phone and email. Strong sense of urgency, with a passion for accuracy and timeliness. Ability to work calmly in high-pressure situations and manage multiple projects/tasks. Ability to work independently and possess superior skills in issue resolution. Should have the passion to learn and implement, analyse and troubleshoot issues.
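For the streaming-pipeline experience listed above, here is a hedged Spark Structured Streaming sketch that consumes JSON events from Kafka and lands them as Parquet with checkpointing. The broker, topic, schema, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Illustrative Spark Structured Streaming job: Kafka JSON -> Parquet with checkpointing.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-to-lake").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder broker
       .option("subscribe", "events")                        # placeholder topic
       .load())

events = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", schema).alias("e"))
             .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "hdfs:///data/stream/events/")            # placeholder sink
         .option("checkpointLocation", "hdfs:///checkpoints/events/")
         .start())

query.awaitTermination()
```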
Posted 2 weeks ago
5.0 years
0 Lacs
Mohali, Punjab
On-site
Job Information Date Opened 05/28/2025 Job Type Full time Industry Education Work Experience 5+ years City S.A.S.Nagar (Mohali) State/Province Punjab Country India Zip/Postal Code 160062 Job Description About Data Couch Pvt. Ltd. Data Couch Pvt. Ltd. is a leading consulting and enterprise training company, specializing in Data Engineering, Big Data, Cloud Technologies, DevOps, and AI/ML. With a footprint across India and strategic global client partnerships, we empower organizations through data-driven decision-making and digital transformation. Our team of expert consultants and trainers works hands-on with the latest technologies to deliver enterprise-grade solutions and skill-building programs. Key Responsibilities Design, develop, and optimize scalable data pipelines using PySpark. Leverage Hadoop ecosystem tools (e.g., HDFS, Hive) for big data processing. Build and manage data workflows on cloud platforms (AWS, GCP, Azure). Utilize Kubernetes for orchestration of containerized data services. Collaborate with data scientists, analysts, and engineers to integrate data solutions. Monitor, maintain, and improve data workflows for performance and reliability. Ensure data governance, compliance, and best practices in documentation. Support MLOps and the integration of AI/ML pipelines into data workflows. Requirements Must-Haves: 6–7+ years of experience in data engineering roles. Strong expertise in PySpark for ETL, transformation, and distributed data processing. Hands-on experience with at least one major cloud platform (AWS, GCP, Azure). Solid understanding of Hadoop tools like HDFS, Hive, etc. Experience with Kubernetes for container orchestration. Proficient in Python and SQL. Experience working with large-scale, distributed data systems. Good-to-Have: Familiarity with tools like Apache Airflow, Kafka, NiFi, or Databricks. Experience with cloud data warehouses such as Snowflake, Redshift, or BigQuery. Exposure to MLOps frameworks such as MLflow, TensorFlow, or PyTorch. Understanding of DevOps, CI/CD, and version control (Git, GitLab, Jenkins). Benefits Innovative Environment: Thrive in a collaborative culture that values innovation and continuous improvement. Learning & Growth: Access to internal training programs, certifications, and mentorship from industry experts. Career Development: Structured growth opportunities and competitive compensation. Cutting-Edge Tech: Work with modern data technologies and contribute to digital transformation initiatives. Health & Wellbeing: Comprehensive medical insurance coverage for you. Work-Life Balance: Generous paid leave .
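As an illustrative PySpark sketch of the ETL work described in this listing: enforce an explicit schema while reading raw CSV, drop malformed rows, de-duplicate on a business key, and write Parquet. The file paths, columns, and key are assumptions.

```python
# Illustrative schema-enforced CSV -> Parquet step with de-duplication.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()

schema = StructType([
    StructField("order_id", StringType(), nullable=False),
    StructField("customer_id", StringType()),
    StructField("quantity", IntegerType()),
])

orders = (spark.read
          .option("header", "true")
          .option("mode", "DROPMALFORMED")   # drop rows that break the schema
          .schema(schema)
          .csv("s3a://raw-bucket/orders/")   # placeholder source path
          .dropDuplicates(["order_id"]))     # assumed business key

orders.write.mode("overwrite").parquet("s3a://curated-bucket/orders/")  # placeholder sink
```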
Posted 2 weeks ago
1.0 - 5.0 years
3 - 7 Lacs
Mumbai
Work from Office
A Data Engineer identifies the business problem and translates it into data services and engineering outcomes. You will deliver data solutions that empower better decision making, with the flexibility to scale and respond to broader business questions. Key responsibilities As a Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts and data scientists to understand the business domain and engage with fellow engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and the flexibility of your solution to scale and respond to broader business questions. If you love to solve problems using your skills, then come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself. Consistently strive to acquire new skills on Cloud, DevOps, Big Data, AI and ML Understand the business problem and translate it to data services and engineering outcomes Explore new technologies and learn new techniques to solve business problems creatively Think big! and drive the strategy for better data quality for the customers Collaborate with many teams - engineering and business - to build better data products Preferred qualifications Over 1-2 years of experience, with hands-on experience in any one programming language (Python, Java, Scala) Understanding of SQL is a must Big data (Hadoop, Hive, Yarn, Sqoop) MPP platforms (Spark, Pig, Presto) Data pipeline & scheduler tools (Oozie, Airflow, NiFi) Streaming engines (Kafka, Storm, Spark Streaming) Any relational database or DW experience Any ETL tool experience Hands-on experience in pipeline design, ETL and application development Good communication skills Experience in working independently and strong analytical skills Dependable and good team player Desire to learn and work with new technologies Automation in your blood
Posted 3 weeks ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Company Description Airtel is a leading global connectivity provider dedicated to unlocking endless opportunities for its consumers. With a focus on innovation and creativity, Airtel delivers impactful solutions and cutting-edge technologies like 5G, IoT, IQ, and Airtel Black. Committed to a 'Digital India' vision, Airtel serves millions of individuals with a subscriber base exceeding 574 million as of 2023. Role Description This is a full-time on-site Big Data Architect role located in Gurugram at Airtel. The Big Data Architect will be responsible for designing, implementing, and maintaining the company's big data infrastructure. This role involves analyzing data requirements, creating data models, and ensuring data security and integrity. Responsibilities · Must be capable of handling commissioning and decommissioning of name nodes, data nodes, and edge nodes in existing or new Apache HDFS clusters. · Work closely with data architects and analysts to design technical solutions. · Integrate and ingest data from multiple source systems into big data environments. · Develop end-to-end data transformations and workflows, ensuring logging and recovery mechanisms. · Must be able to troubleshoot Spark job failures. · Design and implement batch, real-time, and near-real-time data pipelines. · Optimize Big Data transformations using Apache Spark, Hive, and Tez. · Work with Data Science teams to enhance actionable insights. Ensure seamless data integration and transformation across multiple systems. Qualifications · 4+ years of mandatory experience with Big Data. · 4+ years of mandatory experience in Apache Spark. · Proficiency in Apache Spark, Hive on Tez, and Hadoop ecosystem components. · Strong coding skills in Python & PySpark. · Experience building reusable components or frameworks using Spark. · Expertise in data ingestion from multiple sources using APIs, HDFS, and NiFi. · Solid experience working with structured, unstructured, and semi-structured data formats (Text, JSON, Avro, Parquet, ORC, etc.). · Experience with UNIX Bash scripting and databases like Postgres, MySQL and Oracle. · Ability to design, develop, and evolve fault-tolerant distributed systems. · Strong SQL skills, with expertise in Hive, Impala, Mongo and NoSQL databases. · Hands-on with Git and CI/CD tools. · Experience with streaming data technologies (Kafka, Spark Streaming, Apache Flink, etc.). · Proficient with HDFS or similar data lake technologies. · Excellent problem-solving skills; you will be evaluated through coding rounds
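To illustrate the Spark optimization work listed above, here is a hedged sketch that replaces a shuffle-heavy join against a small dimension table with a broadcast join and controls output partitioning. The table names and the 200-partition figure are placeholder assumptions to be tuned per workload.

```python
# Illustrative Spark join optimization: broadcast a small dimension, control output partitions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (SparkSession.builder
         .appName("join-optimization")
         .enableHiveSupport()
         .getOrCreate())

facts = spark.table("curated.usage_events")   # large fact table (assumed)
dims = spark.table("reference.cell_sites")    # small dimension table (assumed)

# Broadcasting the small side avoids a full shuffle of the fact table.
enriched = facts.join(broadcast(dims), on="site_id", how="left")

(enriched
 .repartition(200, "event_date")              # keep partition/file sizes sane
 .write.mode("overwrite")
 .partitionBy("event_date")
 .parquet("hdfs:///data/marts/usage_enriched/"))  # placeholder output path
```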
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Lead Product Manager- Technical Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all. Overview: We are looking for a Lead Product Manager - Technical to join the PVS Identity Solutions team and drive our strategy forward by consistently innovating and problem-solving. The ideal candidate is passionate about technology, highly motivated, intellectually curious, analytical, and possesses an entrepreneurial mindset. Role Provide technical leadership for new major initiatives Deliver innovative, cost-effective solutions which align to enterprise standards Be an integrated part of an Agile engineering team, working interactively with software engineer leads, architects, testing engineers, from the beginning of the development cycle Help ensure functionality delivered in each release is fully tested end to end Manage multiple priorities and tasks in a dynamic work environment.
Identifies issues that will keep the platform features from delivering on time and/or with the desired requirements and communicates to leadership Works with internal teams and customer service to identify, classify, and prioritize feature-level customer issues Coordinates internal forums to collect and identify feature-level development opportunities Owns and manages product documentation; enables self-service support and/or works to reduce overhead Identifies feature risks from business and customer feedback and in-depth analysis of operational performance; shares with senior leadership Digests business customer requirements (user stories, use cases) and platform requirements for a platform feature set Determines release goals for the platform and prioritizes assigned features according to business and platform value, adjusting throughout implementation as needed Reviews product demo with the development team against acceptance criteria for the feature set All About You Bachelor’s degree in computer science or equivalent work experience with hands-on technical and quality engineering skills Excellent technical acumen, strong organizational and problem-solving skills with great attention to detail, critical thinking, solid communication, and proven leadership skills Solid leadership and mentoring skills with the ability to drive change Knowledge of Java, SQLs, APIs (REST/SOAP), code reviews, scanning tools and configuration, and branching techniques Experience with application monitoring tools such as Dynatrace and Splunk Experience with Chaos, software security, and crypto testing practices Experience with DevOps practices (continuous integration and delivery, and tools such as Jenkins) Nice to have knowledge or prior experience with any of the following: Orchestration with Apache NiFi, Apache Airflow Understanding of microservices architecture. Take the time to fully learn the functionality, architecture, dependencies, and runtime properties of the systems supporting your platform products. This includes the business requirements and associated use cases, Mastercard customer's experience, Mastercard's back office systems, the technical stack (application/service architecture), interfaces and associated data flows, dependent applications/services, runtime operations (i.e. trouble management/associated support strategies), and maintenance. Understands and can explain the business context and the associated customer use cases Proficient at grooming user stories, setting entrance/exit criteria and prioritizing a platform product backlog Understands the technologies supporting the platform product and can hold your own in debates with other PM-Ts, TPMs, SDEs, and SPMs Recognize discordant views and take part in constructive dialog to resolve them Verbal and written communication is clear and concise Improve team processes that accelerate delivery, drive innovation, lower costs, and improve quality Corporate Security Responsibility Every person working for, or on behalf of, Mastercard is responsible for information security. All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that the successful candidate for this position must: Abide by Mastercard’s security policies and practices; Ensure the confidentiality and integrity of the information being accessed; Report any suspected information security violation or breach, and Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines. R-249228
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Lead Software Test Engineer (Automation Tester) / Lead SDET
Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
Job Overview
As part of an exciting, fast-paced environment developing payment authentication and security solutions, this position provides technical leadership and expertise within the development lifecycle for the ecommerce payment authentication platform under the Authentication program for Digital.
Overview
We are looking for an Automation Tester to join the PVS Identity Solutions team. This is a pivotal role, responsible for QA, load testing, and automation of various data-driven pipelines. The position involves managing testing infrastructure for functional testing and automation, and coordinating testing that spans multiple programs and projects. The ideal candidate will have experience working with large-scale data and automation testing of Java and cloud-native applications/services.
Position will lead the development and maintenance of automated testing frameworks
Provide technical leadership for new major initiatives
Deliver innovative, cost-effective solutions which align to enterprise standards
Drive the reduction of time spent testing
Work to minimize manual testing by identifying high-ROI test cases and automating them
Be an integrated part of an Agile engineering team, working interactively with software engineer leads, architects, testing engineers, and product managers from the beginning of the development cycle
Help ensure functionality delivered in each release is fully tested end to end
Manage multiple priorities and tasks in a dynamic work environment
All About You
Bachelor’s degree in computer science or equivalent work experience with hands-on technical and quality engineering skills
Expertise in testing methods, standards, and conventions, including automation and test case creation
Excellent technical acumen, strong organizational and problem-solving skills with great attention to detail, critical thinking, solid communication, and proven leadership skills
Solid leadership and mentoring skills with the ability to drive change
Experience in testing ETL processes
Experience with test automation frameworks and Agile delivery
Knowledge of Python/Hadoop/Spark, Java, SQL, APIs (REST/SOAP), code reviews, scanning tools and configuration, and branching techniques
Experience with application monitoring tools such as Dynatrace and Splunk
Experience with chaos, software security, and crypto testing practices
Experience with performance testing
Experience with DevOps practices (continuous integration and delivery, and tools such as Jenkins)
Nice to have knowledge or prior experience with any of the following:
Apache Kafka, Apache Spark with Scala
Orchestration with Apache NiFi, Apache Airflow
Microservices architecture
Build tools like Jenkins
Mobile testing skills
Working with large data sets with terabytes of data
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-249227
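Much of this role is building out automated functional and API checks. As a rough illustration only, here is a minimal pytest-style REST contract test in Python; the base URL, resource path, fields, and status values are hypothetical placeholders, not actual Mastercard services or APIs.

```python
# Minimal sketch of an automated API contract check (hypothetical endpoint and schema).
import requests

BASE_URL = "https://example.internal/api/v1"  # hypothetical service under test


def test_authentication_status_endpoint():
    """Smoke test: the service responds and returns the fields the contract promises."""
    resp = requests.get(f"{BASE_URL}/authentications/12345", timeout=10)
    assert resp.status_code == 200

    body = resp.json()
    # Contract assertions: required keys exist and carry sensible values.
    assert "transactionId" in body
    assert isinstance(body.get("status"), str)
    assert body["status"] in {"APPROVED", "DECLINED", "PENDING"}
```

In practice such checks would be wired into a Jenkins pipeline so every build exercises the contract end to end.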
Posted 3 weeks ago
0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Description:
Expert-level skills in Java 11+ and Spring Boot (Spring Cloud, Spring Security, JPA)
Strong knowledge of RESTful web services
High proficiency with development tools and workflows (JUnit, Maven, continuous workflows, etc.)
Log4j, SSO (single sign-on implementation), Maven, JUnit, Sonar
Experience in SQL and MongoDB
Good to have: Apache NiFi, Groovy scripting
Requirements
Knowledge of the software development lifecycle or IT infrastructure.
Good problem-solving and analytical skills.
Strong communication and teamwork abilities.
Willingness to learn and adapt in a fast-paced environment.
Benefits
Comprehensive Medical Insurance – Covers the entire family, ensuring health and well-being for employees and their loved ones.
Hybrid Work Culture – Only 3 days in-office per week, offering flexibility and reducing commute stress.
Healthy Work-Life Balance – Prioritizes employee well-being, allowing time for personal and professional growth.
Friendly Working Environment – A supportive and collaborative workplace culture that fosters positivity and teamwork.
No Variable Pay Deductions – Guaranteed full compensation with no unpredictable pay cuts, ensuring financial stability.
Certification Reimbursement – The company reimburses the cost of certifications taken based on project demands.
Posted 3 weeks ago
10.0 years
0 Lacs
India
On-site
Embark on an exciting journey into the realm of data analytics with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. As a Snowflake Data Architect, you will be at the heart of the organization and support our clients in taking control of their data and getting value out of it by defining a reference architecture for our customers. This means that you will work closely with business leaders and information management teams to define and implement a roadmap for data management, business intelligence, or analytics solutions. If you are passionate about data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering! 🌍🔥
Relevant Experience: 10+ years in Data Practice
Must-have Skills: Snowflake, Data Architecture, Engineering, Governance, and Cloud services
Responsibilities
Assess existing data components, perform POCs, and consult with stakeholders
Lead the migration to Snowflake
In a strong client-facing role, propose, advocate, and lead the implementation of end-to-end solutions to an enterprise's data-specific business problems, taking care of data collection, extraction, integration, cleansing, enrichment, and data visualization
Design large data platforms to enable Data Engineers, Analysts, and Scientists
Strong exposure to different data architectures, data lakes, and data warehouses, including migrations, re-architecture, and platform modernization
Define tools and technologies to develop automated data pipelines, write ETL processes, develop dashboards and reports, and create insights
Continually reassess the current state for alignment with architecture goals, best practices, and business needs
DB modeling, deciding the best data storage, creating data flow diagrams, and maintaining related documentation
Take care of performance, reliability, reusability, resilience, scalability, security, privacy, and data governance while designing a data architecture
Apply or recommend best practices in architecture, coding, API integration, and CI/CD pipelines
Coordinate with data scientists, analysts, and other stakeholders for data-related needs
Help the Data Science & Analytics Practice grow by mentoring junior Practice members, leading initiatives, and leading Data Practice offerings
Provide thought leadership by representing the Practice / Organization on internal and external platforms
Qualification
Translate business requirements into data requests, reports, and dashboards
Strong database and modeling concepts, with exposure to SQL and NoSQL databases
Strong data architecture patterns and principles; ability to design secure and scalable data lakes, data warehouses, data hubs, and other event-driven architectures
Expertise in designing and writing ETL processes
Strong experience with Snowflake and its components
Knowledge of Master Data Management and related tools
Strong exposure to data security and privacy regulations (GDPR, HIPAA) and best practices
Skilled in ensuring data accuracy, consistency, and quality
Experience with AWS services such as S3, Redshift, Lambda, DynamoDB, EMR, Glue, Lake Formation, Athena, QuickSight, RDS, Kinesis, Managed Kafka, Elasticsearch and ElastiCache, API Gateway, and CloudWatch
Ability to implement data validation processes and establish data quality standards
Experience with Linux and scripting
Proficiency in data visualization tools like Tableau, Power BI, or similar to create meaningful insights
Additional Experience Desired:
Experience working with data ingestion tools such as Fivetran, Stitch, or Matillion
AWS IoT solutions
Apache NiFi, Talend, Informatica
Knowledge of GCP data services
Exposure to AI / ML technologies
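For illustration, a minimal sketch of one step a Snowflake-focused architect might automate: bulk-loading staged files and applying a basic row-count quality gate, using the snowflake-connector-python package. The account, warehouse, database, stage, and table names are hypothetical placeholders, not 3Pillar or client systems.

```python
# Sketch: load staged Parquet files into a Snowflake table, then run a simple quality gate.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Bulk-load staged files into the raw table (stage and table names are placeholders).
    cur.execute("""
        COPY INTO RAW.CUSTOMER_EVENTS
        FROM @RAW.EVENTS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    # Data-quality gate: fail the run if nothing landed.
    cur.execute("SELECT COUNT(*) FROM RAW.CUSTOMER_EVENTS")
    row_count = cur.fetchone()[0]
    if row_count == 0:
        raise RuntimeError("Load produced zero rows; investigate the stage contents.")
finally:
    conn.close()
```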
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Lead Data Engineer – Transfer Solutions
Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
Team Overview
Transfer Solutions is responsible for driving Mastercard’s expansion into new payment flows such as Disbursements & Remittances. The team is working on creating a market-leading money transfer proposition, Mastercard Move, to power the next generation of payments between people and businesses, whether money is moving domestically or across borders, by delivering the ability to pay and get paid with choice, transparency, and flexibility. The Product & Engineering teams within Transfer Solutions are responsible for designing, developing, launching, and maintaining products and services designed to capture these flows from a wide range of customer segments. By addressing customer pain points for domestic and cross-border transfers, the goal is to scale Mastercard’s Disbursements & Remittances business, trebling volume over the next 4 years. If you would like to be part of a global, cross-functional team delivering a highly visible, strategically important initiative in an agile way, this role will be attractive to you. Do you like to be part of a team that is creating and executing strategic initiatives centered around digital payments? Do you look forward to developing and engaging with high-performing, diverse teams around the globe? Would you like to be part of a highly visible, strategically important global engineering organization?
The Role
We are looking for an experienced Data Engineer to design and develop advanced data migration pipelines from traditional OLTP databases (e.g., Oracle) to modern big data platforms such as Cloudera and Databricks. The ideal candidate will possess expertise in technologies such as Python, Java, Spark, and NiFi, along with a proven track record in managing data pipelines for tasks including initial snapshot loading, building Change Data Capture (CDC) pipelines, exception management, reconciliation, data security, and retention. This role also demands proficiency in data modeling, cataloging, taxonomy creation, and ensuring robust data provenance and lineage to support governance and compliance requirements.
Key Responsibilities
Design, develop, and optimize data migration pipelines from OLTP databases like Oracle to big data platforms, including Cloudera CDP/CDH and Databricks.
Build scalable ETL workflows using tools like Python, Scala, Apache Spark, and Apache NiFi to support initial snapshots, CDC, exception handling, and reconciliation processes.
Implement data security measures, such as encryption, access controls, and compliance with data retention policies, across all migration pipelines.
Develop and maintain data models, taxonomy structures, and cataloging systems to ensure logical organization and easy accessibility of data.
Establish data lineage and provenance to ensure traceability and compliance with governance frameworks.
Collaborate with cross-functional teams to understand data migration requirements, ensuring high-quality and timely delivery of solutions.
Monitor and troubleshoot data pipelines to ensure performance, scalability, and reliability.
Stay updated on emerging technologies in data engineering and big data ecosystems, proposing improvements to existing systems and processes.
Required Skills And Qualifications
10+ years of experience in data engineering, with at least 2 years in a leadership or technical lead role.
Proficiency in OLTP databases, particularly Oracle, and data egress techniques.
Strong programming skills in Python, Scala, and Java.
Expertise in Apache Spark, Flink, Kafka, and data integration tools like Apache NiFi.
Hands-on experience with Cloudera Data Platform (CDP/CDH) and Apache Ozone.
Familiarity with cloud-based big data ecosystems such as AWS (Databricks, S3, Glue, etc.).
Familiarity with patterns such as Medallion architecture, data layers, data lakes, and data warehouses; experience building scalable ETL pipelines, optimizing data workflows, and leveraging platforms to integrate, transform, and store large datasets.
Knowledge of data security best practices, including encryption, data masking, and role-based access control.
Exceptional problem-solving and analytical abilities.
Strong communication and leadership skills, with the ability to navigate ambiguity and collaborate effectively across diverse teams.
Optional: awareness of regulatory compliance requirements for data handling and privacy.
Education: Bachelor’s or Master’s degree in Computer Science
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-233628
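As a hedged illustration of the "initial snapshot loading" piece of the migrations described above, the sketch below reads an Oracle table over JDBC with PySpark and lands it as Parquet. The host, table, credentials, and output path are hypothetical placeholders; a real pipeline would add CDC, reconciliation, exception handling, and encryption on top.

```python
# Minimal sketch: initial snapshot of an Oracle table into the data lake with Spark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-initial-snapshot").getOrCreate()

snapshot_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1")  # hypothetical host
    .option("dbtable", "PAYMENTS.TRANSFERS")                     # hypothetical table
    .option("user", "migration_user")
    .option("password", "***")
    .option("fetchsize", 10000)            # rows fetched per round trip
    .option("numPartitions", 8)            # parallel JDBC reads
    .option("partitionColumn", "TRANSFER_ID")
    .option("lowerBound", 1)
    .option("upperBound", 100_000_000)
    .load()
)

# Land the snapshot; downstream CDC jobs then apply incremental changes against it.
(snapshot_df.write
    .mode("overwrite")
    .partitionBy("TRANSFER_DATE")          # hypothetical partition column
    .parquet("s3a://lake/raw/payments/transfers/"))
```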
Posted 3 weeks ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Senior Analyst, Data Strategy
Overview
The Data Quality team in the Data Strategy & Management organization (Chief Data Office) is responsible for developing and driving Mastercard’s core data management program and broadening its data quality efforts. This team ensures that Mastercard maintains and increases the value of Mastercard’s data assets as we enable new forms of payment, new players in the ecosystem, and expanded collection and use of data to support new lines of business. The Enterprise Data Quality team is responsible for ensuring data is of the highest quality and is fit for purpose to support business and analytic uses. The team works to identify and prioritize opportunities for data quality improvement, develops strategic mitigation plans, and coordinates remediation activities with MC Tech and the business owner.
Role
Support the processes for improving and expanding merchant data, including address standardization, geocoding, and incorporating new merchant data sources to assist in improvements.
Assess quality issues in merchant data, transactional data, and other critical data sets.
Support internal and external feedback loops to improve data submitted through the transaction data stream.
Solution and present data challenges in a manner suitable for product and business understanding.
Provide subject matter expertise on merchant data for the organization’s product development efforts.
Coordinate with MC Tech to develop DQ remediation requirements for core systems, the data warehouse, and other critical applications. Manage corresponding remediation projects to ensure successful implementation.
Lead organization-wide awareness and communication of data quality initiatives and remediation activities, ensuring seamless implementation.
Coordinate with critical vendors to manage project timelines and achieve quality deliverables.
Develop and implement, with MC Tech, data pipelines that extract, transform, and load data into an information product that supports organizational strategic goals.
Implement new technologies and frameworks as per project requirements.
All About You
Hands-on experience managing technology projects with a demonstrated ability to understand complex data and technology initiatives.
Ability to lead and influence others to advance deliverables.
Understanding of emerging technologies including, but not limited to, cloud architecture, machine learning/AI, and Big Data infrastructure.
Data architecture experience and experience in building data models.
Experience deploying and working with big data technologies like Hadoop, Spark, and Sqoop.
Experience with streaming frameworks like Kafka and Axon, and pipelines like NiFi.
Proficient in OO programming (Python, Java/Spring Boot/J2EE, and Scala).
Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala).
Experience with Linux, the Unix command line, Unix shell scripting, SQL, and any scripting language.
Experience with data visualization tools such as Tableau, Domo, and/or Power BI is a plus.
Experience presenting data findings in a readable, insight-driven format. Experience building support decks.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-249888
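To make the merchant data quality work concrete, here is a small, hypothetical pandas sketch of the kind of profiling an analyst might run before scoping remediation: completeness, uniqueness, and validity checks. The input file and column names are assumptions, not Mastercard data structures.

```python
# Sketch: profile merchant data quality before deciding what needs remediation.
import pandas as pd

merchants = pd.read_csv("merchants_sample.csv")  # hypothetical extract

checks = {
    # Completeness: share of rows missing a standardized address or geocode.
    "missing_address_pct": merchants["street_address"].isna().mean() * 100,
    "missing_geocode_pct": merchants[["latitude", "longitude"]].isna().any(axis=1).mean() * 100,
    # Uniqueness: duplicate merchant identifiers inflate counts downstream.
    "duplicate_merchant_id_pct": merchants["merchant_id"].duplicated().mean() * 100,
    # Validity: category codes outside the expected 4-digit MCC range.
    "invalid_mcc_pct": (~merchants["mcc"].between(1, 9999)).mean() * 100,
}

for name, value in checks.items():
    print(f"{name}: {value:.2f}%")
```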
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Data Engineer-2
Overview
We are the global technology company behind the world’s fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.
Our Team Within Mastercard – Services
The Services org is a key differentiator for Mastercard, providing the cutting-edge services that are used by some of the world's largest organizations to make multi-million dollar decisions and grow their businesses. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business Test & Learn experimentation, and data-driven information and risk management services.
Data Analytics And AI Solutions (DAAI) Program
Within the D&S Technology Team, the DAAI program is a relatively new program comprising a rich set of products that provide accurate perspectives on Portfolio Optimization and Ad Insights. Currently, we are enhancing our customer experience with new user interfaces, moving to API- and web application-based data publishing to allow for seamless integration in other Mastercard products and externally, utilizing new data sets and algorithms to further analytic capabilities, and generating scalable big data processes. We are looking for an innovative software engineering lead who will lead the technical design and development of an Analytic Foundation. The Analytic Foundation is a suite of individually commercialized analytical capabilities (think prediction as a service, matching as a service, or forecasting as a service) that also includes a comprehensive data platform. These services will be offered through a series of APIs that deliver data and insights from various points along a central data store. This individual will partner closely with other areas of the business to build and enhance solutions that drive value for our customers. Engineers work in small, flexible teams. Every team member contributes to designing, building, and testing features. The range of work you will encounter varies from building intuitive, responsive UIs to designing backend data models, architecting data flows, and beyond. There are no rigid organizational structures, and each team uses processes that work best for its members and projects. Here are a few examples of products in our space:
Portfolio Optimizer (PO) is a solution that leverages Mastercard’s data assets and analytics to allow issuers to identify and increase revenue opportunities within their credit and debit portfolios.
Ad Insights uses anonymized and aggregated transaction insights to offer targeting segments that have a high likelihood to make purchases within a category, allowing for more effective campaign planning and activation.
Help found a new, fast-growing engineering team!
Position Responsibilities
As a Data Engineer within DAAI, you will:
Play a large role in the implementation of complex features
Push the boundaries of analytics and powerful, scalable applications
Build and maintain analytics and data models to enable performant and scalable products
Ensure a high-quality code base by writing and reviewing performant, well-tested code
Mentor junior engineers and teammates
Drive innovative improvements to team development processes
Partner with Product Managers and Customer Experience Designers to develop a deep understanding of users and use cases and apply that knowledge to scoping and building new modules and features
Collaborate across teams with exceptional peers who are passionate about what they do
Ideal Candidate Qualifications
4+ years of data engineering experience in an agile production environment
Experience leading the design and implementation of large, complex features in full-stack applications
Ability to move easily between business, data management, and technical teams; ability to quickly intuit the business use case and identify technical solutions to enable it
Experience leveraging open source tools, predictive analytics, machine learning, advanced statistics, and other data techniques to perform analyses
High proficiency in using Python or Scala, Spark, Hadoop platforms and tools (Hive, Impala, Airflow, NiFi, Sqoop), and SQL to build big data products and platforms
Experience building and deploying production-level data-driven applications and data processing workflows/pipelines and/or implementing machine learning systems at scale in Java, Scala, or Python, and delivering analytics involving all phases such as data ingestion, feature engineering, modeling, tuning, evaluation, monitoring, and presentation
Experience in cloud technologies like Databricks/AWS/Azure
Strong technologist with a proven track record of learning new technologies and frameworks
Customer-centric development approach
Passion for analytical and quantitative problem solving
Experience identifying and implementing technical improvements to development processes
Collaboration skills with experience working with people across roles and geographies
Motivation, creativity, self-direction, and desire to thrive on small project teams
Superior academic record with a degree in Computer Science or a related technical field
Strong written and verbal English communication skills
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
R-246504
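As a loose sketch of the orchestration side of this role, the following is a minimal Airflow DAG with two dependent tasks (daily ingest, then feature building). The DAG id, task logic, and schedule are hypothetical placeholders rather than an actual DAAI pipeline.

```python
# Minimal Airflow DAG sketch: daily ingest step followed by a feature-building step.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_transactions(**_):
    # Placeholder: pull the day's files into the raw zone.
    print("ingesting transactions")


def build_features(**_):
    # Placeholder: aggregate raw transactions into model-ready features.
    print("building features")


with DAG(
    dag_id="daai_daily_pipeline",      # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_transactions", python_callable=ingest_transactions)
    features = PythonOperator(task_id="build_features", python_callable=build_features)

    ingest >> features  # features run only after ingest succeeds
```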
Posted 3 weeks ago