15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: AIX System Administration
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Your Role and Responsibilities
As AIX Administrator, you are responsible for installation, implementation, customization, operation, recovery and performance tuning, with proven knowledge of the fundamental concepts and technologies associated with the AIX operating system.

Responsibilities:
- Installation, configuration and troubleshooting of the AIX and Unix operating systems
- Clustering and high-availability management
- UNIX/AIX server support: design, troubleshooting and storage implementation, articulating standard methodologies during implementation
- Supporting systems running on UNIX/AIX platforms, including OS clustering, partitioning and virtualization
- Day-to-day UNIX/AIX operating system installation, migration and break-fix support
- SAN software and storage administration, and integration with operating systems
- Raising PMRs and working with the PMR team
- Installing, configuring and maintaining IBM AIX and Unix servers
- Installation and configuration of virtualization
- Installation and configuration of cluster environments using HACMP
- Configuration and administration of Logical Volume Manager
- Patch and package administration
- Writing shell scripts to automate day-to-day system administration tasks
- Configuring and supporting domains, LPARs and DLPARs
- Administering and configuring file systems such as JFS, VxFS and pseudo file systems
- Troubleshooting hardware and operating-system issues
- Capacity planning and tuning systems for optimal performance
- Understanding of SAN and NAS storage
- Administration of NIS or LDAP environments

Required Technical and Professional Expertise
- Minimum 7 years of experience in Unix administration in the IT industry
- AIX administration and Linux administration (RedHat/SUSE/Ubuntu)
- Automation experience in OS patching and upgrades
- Ability to work independently with vendors to resolve OS and hardware issues
- Proven working experience installing and configuring middleware, and troubleshooting Linux/AIX-based environments
- Knowledge of SAN and NAS storage
- Experience with physical, virtual and containerized environments

Preferred Technical and Professional Expertise
- Proactive monitoring and capacity planning experience
- Knowledge of Ansible or another automation tool is a must
- Scripting knowledge (Bash/Python/Perl) to automate day-to-day activities
- Willingness to adopt new technology
- Expertise in cluster configuration and troubleshooting
- Working knowledge of incident and change management

Qualification: 15 years full time education
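The "writing shell scripts to automate day-to-day system administration tasks" item above can be illustrated with a small, hypothetical sketch: a script that parses `df -g`-style output (the AIX gigabyte-units form of `df`) and flags filesystems above a usage threshold. The sample output, the 80% threshold, and the function name are illustrative assumptions, not part of the role description.

```python
# Hypothetical sketch of the kind of day-to-day automation the role describes:
# parse `df -g`-style output (AIX) and flag filesystems above a usage threshold.
# The sample output and 80% default threshold are illustrative assumptions.

SAMPLE_DF_OUTPUT = """\
Filesystem    GB blocks      Free %Used    Iused %Iused Mounted on
/dev/hd4          2.00      0.30   85%    12034    15% /
/dev/hd2          8.00      4.10   49%    61220     6% /usr
/dev/hd9var       4.00      0.50   88%     9080     8% /var
"""

def filesystems_over_threshold(df_output: str, threshold: int = 80) -> list[tuple[str, int]]:
    """Return (mount_point, pct_used) pairs for filesystems above `threshold` percent."""
    flagged = []
    for line in df_output.splitlines()[1:]:          # skip the header row
        fields = line.split()
        pct_used = int(fields[3].rstrip("%"))        # the '%Used' column, e.g. '85%'
        mount_point = fields[-1]
        if pct_used > threshold:
            flagged.append((mount_point, pct_used))
    return flagged

if __name__ == "__main__":
    for mount, pct in filesystems_over_threshold(SAMPLE_DF_OUTPUT):
        print(f"WARNING: {mount} is {pct}% full")
```

In practice a script like this would read live `df -g` output (for example via `subprocess`) and feed a cron-driven alert, rather than a hard-coded sample.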
Posted 3 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Gurugram
Work from Office
About The Role
Project Role: Engineering Services Practitioner
Project Role Description: Assist with end-to-end engineering services to develop technical engineering solutions that solve problems and achieve business objectives, using scientific, socio-economic and technical knowledge and practical experience. Work across structural and stress design, qualification, configuration and technical management.
Must have skills: 5G Wireless Networks & Technologies
Good to have skills: AWS Core Infrastructure, Red Hat Cloud Architecture and Design
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Job Title: 5G Core Network Specialist Team Lead / Associate Manager (Nokia/Ericsson)
Location: Delhi-NCR / Bangalore
Job Type: Full-Time
Department: Core Network Engineering / Mobile Network Operations
Team Size: 4-10 engineers (depending on project scope)

Summary:
We are seeking a dynamic and technically proficient 5G Core Network Specialist Team Lead / Associate Manager to lead a team of engineers in the design, deployment, and optimization of Nokia or Ericsson 5G Core Network solutions. This leadership role requires a blend of deep technical expertise, project management capability, and people leadership to drive the successful delivery of 5G Core initiatives across SA and NSA deployments.

Roles and Responsibilities:
Technical Leadership & Strategy
- Lead the end-to-end design, integration, and optimization of 5G Core Network components (AMF, SMF, UPF, AUSF, UDM, NRF, PCF, NSSF, NEF).
- Define and enforce best practices for 5G Core deployment, configuration, and lifecycle management.
- Provide technical direction on Nokia or Ericsson 5G Core platforms, ensuring alignment with 3GPP standards and business goals.
Team & Project Management
- Manage a team of core network engineers: assign tasks, mentor junior staff, and conduct performance reviews.
- Coordinate with cross-functional teams including RAN, Transport, IT, and Security to ensure seamless integration.
Operations & Optimization
- Supervise the monitoring, troubleshooting, and resolution of complex network issues.
- Lead root cause analysis (RCA) for major incidents and implement preventive measures.
- Drive continuous improvement in network performance, reliability, and automation.
Innovation & Automation
- Promote the adoption of cloud-native technologies, CI/CD pipelines, and DevOps practices in 5G Core operations.
- Guide the development of automation scripts and tools for provisioning, monitoring, and reporting.

Technical Experience and Professional Attributes:
- 10-15 years of experience in mobile core networks, with 3+ years in 5G Core (SA/NSA).
- Proven leadership experience managing technical teams or leading large-scale 5G Core projects.
- Hands-on expertise with Nokia or Ericsson 5G Core Network components and management tools.
- Strong understanding of 3GPP Release 15/16/17 standards and service-based architecture (SBA).
- Proficiency in protocols: SCTP, HTTP/2, PFCP, GTP, Diameter, TCP/IP, TLS.
- Experience with NFV/SDN, cloud-native platforms (OpenStack, Kubernetes), and containerized network functions (CNFs).

Preferred Skills & Certifications:
- Nokia NSP/5G Certification or Ericsson Certified Expert - 5G Core.
- Certifications in project management (PMP, PRINCE2) or Agile/Scrum methodologies.
- Familiarity with CI/CD tools (Git, Jenkins), automation frameworks (Ansible, Python), and monitoring tools (Prometheus, Grafana).
- Knowledge of IMS, VoNR, VoLTE, and interworking with legacy EPC/CS networks.

Educational Qualifications: 15 years of education; Bachelor's or Master's degree in Telecommunications, Electronics, Computer Science, or a related field.

Additional Information: Team player.
Qualification: 15 years full time education
Posted 3 weeks ago
5.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
About the Role
As an SRE (Big Data) Engineer at PhonePe (5 to 7 years of experience), you will be responsible for ensuring the stability, scalability, and performance of distributed systems operating at scale. You will collaborate with development, infrastructure, and data teams to automate operations, reduce manual effort, handle incidents, and continuously improve system reliability. This role requires strong problem-solving skills, operational ownership, and a proactive approach to mentoring and driving engineering excellence.

Roles and Responsibilities
- Ensure the ongoing stability, scalability, and performance of PhonePe's Hadoop ecosystem and associated services.
- Manage and administer Hadoop infrastructure including HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, Pinot, and Druid.
- Automate BAU operations through scripting and tool development.
- Perform capacity planning, system tuning, and performance optimization.
- Set up, configure, and manage Nginx in high-traffic environments.
- Administer and troubleshoot Linux and big data systems, including networking (IP, iptables, IPsec).
- Handle on-call responsibilities, investigate incidents, perform root cause analysis, and implement mitigation strategies.
- Collaborate with infrastructure, network, database, and BI teams to ensure data availability and quality.
- Apply system updates and patches, and manage version upgrades in coordination with security teams.
- Build tools and services to improve observability, debuggability, and supportability.
- Participate in Kerberos and LDAP administration.
- Perform capacity planning and performance tuning of Hadoop clusters.
- Work with configuration management and deployment tools such as Puppet, Chef, Salt, or Ansible.

Skills Required
- Minimum 1 year of Linux/Unix system administration experience.
- Over 4 years of hands-on experience in Hadoop administration.
- Minimum 1 year of experience managing infrastructure on public cloud platforms like AWS, Azure, or GCP (optional).
- Strong understanding of networking, open-source tools, and IT operations.
- Proficiency in scripting and programming (Perl, Golang, or Python).
- Hands-on experience maintaining and managing Hadoop ecosystem components such as HDFS, YARN, HBase, and Kafka.
- Strong operational knowledge of systems (CPU, memory, storage, OS-level troubleshooting).
- Experience administering and tuning relational and NoSQL databases.
- Experience configuring and managing Nginx in production environments.
- Excellent communication and collaboration skills.

Good to Have
- Experience designing and maintaining Airflow DAGs to automate scalable and efficient workflows.
- Experience in ELK stack administration.
- Familiarity with monitoring tools like Grafana, Loki, Prometheus, and OpenTSDB.
- Exposure to security protocols and tools (Kerberos, LDAP).
- Familiarity with distributed systems such as Elasticsearch or similar high-scale environments.

PhonePe Full Time Employee Benefits (not applicable for intern or contract roles)
- Insurance Benefits: Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program: Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support: Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits: Relocation benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits: Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits: Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog.
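As a small, hedged illustration of the Nginx-plus-observability side of this role (not PhonePe's actual tooling), the sketch below computes a 5xx error rate from access-log lines in Nginx's default "combined" log format. The sample log lines are made up for demonstration.

```python
# Illustrative sketch (not any company's actual tooling): compute the 5xx error
# rate from Nginx access-log lines in the default "combined" format, the kind of
# check an SRE might wire into monitoring for a high-traffic Nginx tier.
import re

# In the combined format the status code is the field right after the quoted
# request line, e.g.: ... "GET /pay HTTP/1.1" 200 512 ...
STATUS_RE = re.compile(r'"\s(\d{3})\s')

SAMPLE_LOG = """\
10.0.0.1 - - [01/Jan/2025:10:00:00 +0530] "GET /pay HTTP/1.1" 200 512 "-" "curl/8.0"
10.0.0.2 - - [01/Jan/2025:10:00:01 +0530] "GET /pay HTTP/1.1" 502 64 "-" "curl/8.0"
10.0.0.3 - - [01/Jan/2025:10:00:02 +0530] "POST /txn HTTP/1.1" 201 128 "-" "curl/8.0"
10.0.0.4 - - [01/Jan/2025:10:00:03 +0530] "GET /pay HTTP/1.1" 503 64 "-" "curl/8.0"
"""

def error_rate_5xx(log_text: str) -> float:
    """Fraction of requests with a 5xx status (0.0 when the log is empty)."""
    statuses = [int(m.group(1)) for m in STATUS_RE.finditer(log_text)]
    if not statuses:
        return 0.0
    return sum(500 <= s < 600 for s in statuses) / len(statuses)

if __name__ == "__main__":
    print(f"5xx error rate: {error_rate_5xx(SAMPLE_LOG):.0%}")
```

A production version would tail the live access log (or use Prometheus/Grafana, which the posting names) and alert when the rate crosses a threshold.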
Posted 3 weeks ago
7.0 - 12.0 years
40 - 45 Lacs
Pune
Work from Office
Job Title: Data Platform Engineer - Tech Lead
Location: Pune, India

Role Description
DB Technology is a global team of tech specialists spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence: our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers and is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.

CB Data Services and Data Platform
We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
Technical Leadership:
- Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions.
- Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
- Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architecture and Design:
- Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics.
- Evaluate and recommend tools, technologies, and best practices to enhance the data platform.
- Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance:
- Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
- Oversee code reviews and provide constructive feedback to promote code quality and team growth.

Your skills and experience
Technical Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc and data management.
- Proven experience leading software engineering teams, with a focus on mentorship, guidance, and team growth.
- Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
- Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP) and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
- Solid understanding of data quality management and best practices for ensuring data integrity.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
- Excellent problem-solving skills and the ability to troubleshoot complex systems.
- Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities:
- Proven experience leading technical teams, with a track record of delivering complex projects on time and within scope.
- Ability to inspire and motivate team members, promoting a collaborative and innovative work environment.
- Strong problem-solving skills and the ability to make data-driven decisions under pressure.
- Excellent communication and collaboration skills.
- Proactive mindset, attention to detail, and a constant desire to improve and innovate.

How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
5.0 - 10.0 years
22 - 27 Lacs
Navi Mumbai
Work from Office
As an Architect at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Architect, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Strong understanding of data lake approaches, industry standards and industry best practices
- Detailed understanding of the Hadoop framework and ecosystem, MapReduce, and data on containers (data in OpenShift)
- Applies individual experience and competency, together with IBM's structured architectural thinking model, to analyze client IT systems
- Experience with relational SQL, Big Data, etc.
- Experience with cloud-native platforms such as AWS, Azure, Google Cloud or IBM Cloud, or cloud-native data platforms like Snowflake

Preferred technical and professional experience
- Knowledge of MS Azure Cloud
- Experience in Unix shell scripting and Python
Posted 3 weeks ago
3.0 - 8.0 years
5 - 8 Lacs
Mumbai
Work from Office
Role Overview: Seeking an experienced Apache Airflow specialist to design and manage data orchestration pipelines for batch/streaming workflows in a Cloudera environment.

Key Responsibilities:
* Design, schedule, and monitor DAGs for ETL/ELT pipelines
* Integrate Airflow with Cloudera services and external APIs
* Implement retries, alerts, logging, and failure recovery
* Collaborate with data engineers and DevOps teams

Required education: Bachelor's Degree
Preferred education: Master's Degree

Skills Required:
* Experience: 3-8 years
* Expertise in Airflow 2.x, Python, Bash
* Knowledge of CI/CD for Airflow DAGs
* Proven experience with Cloudera CDP and Spark/Hive-based data pipelines
* Integration with Kafka, REST APIs, and databases
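The "retries, alerts, logging, and failure recovery" responsibility above can be sketched, under stated assumptions, as a minimal library-free retry wrapper. In real Airflow 2.x this is normally configured declaratively via the task-level `retries`, `retry_delay`, and `on_failure_callback` parameters; the pure-Python sketch below (all names are illustrative) just shows the underlying mechanics.

```python
# Minimal, library-free sketch of the retry/alert pattern an orchestrator uses.
# Real Airflow 2.x exposes this via `retries`, `retry_delay`, and
# `on_failure_callback` on tasks; this shows the mechanics only.
import logging
import time
from typing import Callable, Optional

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(task: Callable[[], object], retries: int = 3,
                     base_delay: float = 0.01,
                     on_failure: Optional[Callable[[Exception], None]] = None):
    """Run `task`, retrying with exponential backoff; alert via `on_failure`."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, retries, exc)
            if attempt == retries:
                if on_failure is not None:
                    on_failure(exc)          # e.g. page on-call / send an alert
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))   # exponential backoff

if __name__ == "__main__":
    calls = {"n": 0}
    def flaky_extract():
        calls["n"] += 1
        if calls["n"] < 3:
            raise RuntimeError("source unavailable")
        return "rows=42"
    print(run_with_retries(flaky_extract))   # succeeds on the third attempt
```

The exponential backoff (`base_delay * 2 ** (attempt - 1)`) mirrors the common orchestrator default of spacing retries further apart so a transient outage has time to clear.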
Posted 3 weeks ago
3.0 - 7.0 years
10 - 14 Lacs
Pune
Work from Office
The Developer leads cloud application development and deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Strong proficiency in Java, the Spring Framework, Spring Boot and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Primary skills: Core Java, Spring Boot, Java2/EE, Microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc.); Spark; good to have: Python
- Strong knowledge of microservice logging, monitoring, debugging and testing; in-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and messaging platforms such as Kafka or IBM MQ
- Good understanding of Test-Driven Development; familiarity with Ant, Maven or another build automation framework
- Good knowledge of basic UNIX commands; experience in concurrent design and multi-threading

Preferred technical and professional experience: None
Posted 3 weeks ago
2.0 - 5.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Developing, maintaining, evaluating, and testing big data solutions
- Data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Big Data development: Hadoop, Hive, Spark, PySpark, strong SQL
- Ability to incorporate a variety of statistical and machine learning techniques
- Basic understanding of cloud platforms (AWS, Azure, etc.)
- Ability to use programming languages like Java, Python, Scala, etc. to build pipelines that extract and transform data from a repository to a data consumer
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java

Preferred technical and professional experience
- Basic understanding of, or experience with, predictive/prescriptive modeling
- You thrive on teamwork and have excellent verbal and written communication skills
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
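The "source to target" pipeline work described above can be sketched as a toy extract-transform-load flow. On the job this would run as Spark/PySpark over Hadoop/Hive; the plain-Python version below, with made-up sample data and illustrative function names, only shows the shape of the three stages.

```python
# Toy illustration of a "source to target" pipeline in plain Python; in practice
# this would be Spark/PySpark over Hive/Hadoop, but the extract -> transform ->
# load shape is the same. All sample data and names are made up.
import csv
import io

SOURCE_CSV = """\
txn_id,amount,currency
t1,100.50,INR
t2,-5.00,INR
t3,250.00,INR
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw source into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: cast types and drop invalid (non-positive) amounts."""
    out = []
    for r in records:
        amount = float(r["amount"])
        if amount > 0:
            out.append({"txn_id": r["txn_id"], "amount": amount,
                        "currency": r["currency"]})
    return out

def load(records: list[dict], target: dict) -> None:
    """Load: write records into the target store, keyed by txn_id."""
    for r in records:
        target[r["txn_id"]] = r

if __name__ == "__main__":
    warehouse: dict = {}
    load(transform(extract(SOURCE_CSV)), warehouse)
    print(sorted(warehouse))   # the negative-amount row t2 is filtered out
```

Keeping the three stages as separate functions is what makes a pipeline like this testable stage by stage, which is also how DAG-based workflow tools split the work into tasks.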
Posted 3 weeks ago
3.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation
- Stakeholder collaboration and issue resolution: collaborating with key internal and external stakeholders to understand problems and issues with the product and its features, and resolving those issues per defined SLAs
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Primary skills: Core Java, Spring Boot, Java2/EE, Microservices
- Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop etc.)
- Spark
- Good to have: Python

Preferred technical and professional experience: None
Posted 3 weeks ago
8.0 - 13.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Strategic Technology Group

Responsibilities
Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve, constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding:
- End-to-end contribution to technology-oriented development projects
- Providing solutions with minimum system requirements, in Agile mode
- Collaborating with other Power Programmers, the open source community and tech user groups
- Custom development of new platforms, solutions and opportunities
- Working on large-scale digital platforms and marketplaces
- Working on complex engineering projects using cloud-native architecture
- Working with innovative Fortune 500 companies on cutting-edge technologies
- Co-creating and developing new products and platforms for our clients
- Contributing to open source and continuously upskilling in the latest technology areas
- Incubating tech user groups

Technical and Professional Requirements: Big Data - Spark, Scala, Hive, Kafka
Preferred Skills: Technology-Big Data-HBase; Technology-Big Data-Sqoop; Technology-Java-Apache-Scala; Technology-Functional Programming-Scala; Technology-Big Data - Data Processing-Map Reduce; Technology-Big Data - Data Processing-Spark
Posted 3 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering
Service Line: Data & Analytics Unit

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements: Technology-Functional Programming-Scala; Technology-Java-Apache-Scala
Preferred Skills: Technology-Java-Apache-Scala; Technology-Functional Programming-Scala
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Locations: Mumbai, Delhi/NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 3 weeks ago
4.0 - 9.0 years
10 - 12 Lacs
Bengaluru, Doddakannell, Karnataka
Work from Office
We are seeking a highly skilled Data Engineer with expertise in ETL techniques, programming, and big data technologies. The candidate will play a critical role in designing, developing, and maintaining robust data pipelines, ensuring data accuracy, consistency, and accessibility. This role involves collaboration with cross-functional teams to enrich and maintain a central data repository for advanced analytics and machine learning. The ideal candidate should have experience with cloud-based data platforms, data modeling, and data governance processes. Location: Bengaluru (Doddakannell, Sarjapur Road), Karnataka
Posted 3 weeks ago
3.0 - 6.0 years
25 - 30 Lacs
Chennai
Work from Office
Zalaris is looking for a Senior Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs
Posted 3 weeks ago
3.0 - 6.0 years
25 - 30 Lacs
Pune
Work from Office
Diverse Lynx is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed
- Reformulating existing frameworks to optimize their functioning
- Testing such structures to ensure that they are fit for use
- Preparing raw data for manipulation by data scientists
- Detecting and correcting errors in your work
- Ensuring that your work remains backed up and readily accessible to relevant coworkers
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
New Delhi, Ahmedabad, Bengaluru
Work from Office
We are seeking a skilled Big Data Developer with 3+ years of experience to develop, maintain, and optimize large-scale data pipelines using frameworks like Spark, PySpark, and Airflow. The role involves working with SQL, Impala, Hive, and PL/SQL for advanced data transformations and analytics, designing scalable data storage systems, and integrating structured and unstructured data using tools like Sqoop. The ideal candidate will collaborate with cross-functional teams to implement data warehousing strategies and leverage BI tools for insights. Proficiency in Python programming, workflow orchestration with Airflow, and Unix/Linux environments is essential. Location: Remote, Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 3 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Mumbai
Work from Office
About the Job The Red Hat Sales team is looking for an experienced Account Solutions Architect to join us in Mumbai, India. In this role, you will provide the first major experience our customers have with Red Hat while creating possibilities, solving problems, and establishing working relationships. You'll discover and analyze the business and technical needs of our customers, while collaborating with the Sales and Technical Delivery teams to help them invest wisely in the best solutions that will give their systems maximum flexibility, allowing them to run faster and more efficiently. You'll need to have extensive technical expertise, passion for open source, a thorough understanding of business processes, and the ability to identify and solve issues at the enterprise level. As an Account Solutions Architect, you will also need to have great communication and people skills. What will you do Develop strategic relationships with our customers to become a trusted adviser for Red Hat's offerings and solutions Demonstrate ownership of the technical relationships and technical sales cycle within a set of named accounts in your territory Ensure revenue and new business quotas/targets and service objectives are met while maintaining a high level of satisfaction among prospective and existing customer Provide presales technical support to our Enterprise Sales team Support evaluations of our offerings and technical proofs of concepts Respond to customer and partner inquiries, including requests for proposal (RFPs) and requests for information (RFIs) Provides pre-sales technical support for the development and implementation of complex solutions. 
Use in-depth domain and product knowledge to provide technical expertise to customers or partners through sales presentations, product demonstrations, workshops, evaluations, and proofs of concept/technology (POCs/POTs)
Assess potential application of company products to meet customer needs and prepare detailed product specifications for the development and implementation of customer solutions
Create detailed design and implementation specifications for complex products/applications/solutions
Provide consultation to prospective users/customers/partners on product capability assessment and validation
What will you bring
10+ years of experience in the IT industry
5+ years of experience working as a presales engineer, consultant, IT architect, or equivalent supporting partners and enterprises
5+ years of experience working in the BFSI industry
5+ years of experience designing or implementing complex application systems, cloud, multi-datacenter, and modernized application environments, as well as multi-product integration
Experience in application modernization and digital transformation, with an understanding of modern technologies and practices like Kubernetes, containers and microservices architecture, agile development, and DevSecOps, and associated capabilities like automation, orchestration, and configuration management
Ability to explain technical concepts to non-technical audiences
Familiarity with enterprise solutions and architectures, including cloud, big data, virtualization, storage, middleware, clustering, and high availability
Excellent presentation skills; ability to present to small and large groups of mixed business, technical, management, and leadership audiences
Record of developing relationships at engineering, commercial, and executive levels throughout large enterprise IT organizations
Understanding of complex enterprise solutions and architectures
Ability to work well in a team environment and collaborate with others to provide the best solutions
Knowledge of
sophisticated sales motions
Willingness to travel up to 50% of the time
Record of working with partners, distributors, consultants, and service partners to create solution propositions around Red Hat's solutions
Expertise in one or more offerings from the Red Hat portfolio, such as OpenShift, Ansible, RHEL, JBoss, or Application Services/Middleware
The following are considered a plus:
Ability to handle multiple priorities and manage multiple large transactions between multiple organizations
Experience working as an enterprise architect and strategizing with C-level users regarding technologies and roadmaps
Red Hat Certified Architect (RHCA), Red Hat Certified Engineer (RHCE), VMware Certified Professional (VCP), or Information Technology Infrastructure Library (ITIL) certifications
About Red Hat
Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver high-performing Linux, cloud, container, and Kubernetes technologies. Spread across 40+ countries, our associates work flexibly across work environments, from in-office, to office-flex, to fully remote, depending on the requirements of their role. Red Hatters are encouraged to bring their best ideas, no matter their title or tenure. We're a leader in open source because of our open and inclusive environment. We hire creative, passionate people ready to contribute their ideas, help solve complex problems, and make an impact.
Inclusion at Red Hat
Red Hat's culture is built on the open source principles of transparency, collaboration, and inclusion, where the best ideas can come from anywhere and anyone. When this is realized, it empowers people from different backgrounds, perspectives, and experiences to come together to share ideas, challenge the status quo, and drive innovation. Our aspiration is that everyone experiences this culture with equal opportunity and access, and that all voices are not only heard but also celebrated.
We hope you will join our celebration, and we welcome and encourage applicants from all the beautiful dimensions that compose our global village.
Equal Opportunity Policy (EEO)
Red Hat is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law. Red Hat supports individuals with disabilities and provides reasonable accommodations to job applicants. If you need assistance completing our online job application, email application-assistance@redhat.com. General inquiries, such as those regarding the status of a job application, will not receive a reply.
Posted 3 weeks ago
12.0 - 17.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Project description
Join our data engineering team to lead the design and implementation of advanced graph database solutions using Neo4j. This initiative supports the organization's mission to transform complex data relationships into actionable intelligence. You will play a critical role in architecting scalable graph-based systems, driving innovation in data connectivity, and empowering cross-functional teams with powerful tools for insight and decision-making.
Responsibilities
Graph Data Modeling & Implementation: Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database. Mentor junior developers and review code to maintain high-quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions.
Skills
Must have
12+ years in software/data engineering, with at least 3-5 years of hands-on experience with Neo4j. Lead the technical strategy, architecture, and delivery of Neo4j-based solutions. Design, model, and implement complex graph data structures using Cypher and Neo4j best practices. Guide the integration of Neo4j with other data platforms and microservices. Collaborate with cross-functional teams to understand business needs and translate them into graph-based models. Mentor junior developers and ensure code quality through reviews and best practices. Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j. Stay up to date with emerging technologies in the graph database and data engineering space.
Strong proficiency in Cypher query language, graph modeling, and data visualization tools (e.g., Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. Excellent problem-solving and communication skills. Nice to have N/A
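The graph modeling idea at the heart of this role can be illustrated without a running Neo4j instance. This is a conceptual, stdlib-only Python sketch: nodes carry labels and properties, relationships are typed, and a traversal answers the kind of question a Cypher query would express declaratively. All entity and relationship names here are hypothetical.

```python
# Tiny in-memory property graph: labeled nodes plus typed
# (start, relationship, end) triples, mirroring Neo4j's model.

nodes = {
    "p1": {"label": "Person", "name": "Asha"},
    "p2": {"label": "Person", "name": "Ravi"},
    "c1": {"label": "Company", "name": "Acme"},
}
edges = [("p1", "WORKS_AT", "c1"), ("p2", "WORKS_AT", "c1")]

def colleagues(person_id):
    # Illustrative Cypher equivalent:
    #   MATCH (p:Person)-[:WORKS_AT]->(c)<-[:WORKS_AT]-(other:Person)
    #   WHERE id(p) = $person_id RETURN other.name
    companies = {e for s, r, e in edges if s == person_id and r == "WORKS_AT"}
    return sorted(
        nodes[s]["name"]
        for s, r, e in edges
        if r == "WORKS_AT" and e in companies and s != person_id
    )

print(colleagues("p1"))  # ['Ravi']
```

The point of the graph model is that this "shared neighbor" question stays a one-hop pattern match no matter how large the dataset grows, whereas the relational equivalent needs a self-join through a link table.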
Posted 4 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Mumbai
Work from Office
Role Overview: Lead the architectural design and implementation of a secure, scalable Cloudera-based Data Lakehouse for one of India’s top public sector banks.
Key Responsibilities:
* Design end-to-end Lakehouse architecture on Cloudera
* Define data ingestion, processing, storage, and consumption layers
* Guide data modeling, governance, lineage, and security best practices
* Define migration roadmap from existing DWH to CDP
* Lead reviews with client stakeholders and engineering teams
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
* Proven experience with Cloudera CDP, Spark, Hive, HDFS, Iceberg
* Deep understanding of Lakehouse patterns and data mesh principles
* Familiarity with data governance tools (e.g., Apache Atlas, Collibra)
* Banking/FSI domain knowledge highly desirable
Posted 4 weeks ago
8.0 - 13.0 years
9 - 13 Lacs
Bengaluru
Work from Office
As a Technical Specialist, you will develop and enhance Optical Network Management applications, leveraging experience in Optical Networks. You will work with fault supervision and performance monitoring. Collaborating in an agile environment, you will drive innovation, optimize efficiency, and explore UI technologies like React. Your role will focus on designing, coding, testing, and improving network management applications to enhance functionality and customer satisfaction.
You have:
Bachelor's degree and 8 years of experience (or equivalent) in Optical Networks
Hands-on working experience with Core Java, Spring, Kafka, ZooKeeper, Hibernate, and Python
Working knowledge of RDBMS, PL/SQL, Linux, Docker, and database concepts
Exposure to UI technologies like React
It would be nice if you also had:
Domain knowledge in OTN and photonic network management
Strong communication skills and the ability to manage complex relationships
Responsibilities:
Develop software for Network Management of Optics Division products, including Photonic/WDM, Optical Transport, SDH, and SONET
Enable user control over network configuration through Optics Network Management applications
Utilize Core Java, Spring, Kafka, Python, and RDBMS to build high-performing solutions for network configuration
Interface Optics Network Management applications with various Network Elements, providing a user-friendly graphical interface and implementing algorithms to simplify network management and reduce OPEX
Deploy Optics Network Management applications globally, supporting hundreds of installations for customers
Contribute to new developments and maintain applications as part of the development team, focusing on enhancing functionality and customer satisfaction
Posted 4 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Educational: Bachelor of Engineering Service Line: Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability Good knowledge of software configuration management systems Awareness of the latest technologies and industry trends Logical thinking and problem-solving skills along with an ability to collaborate Understanding of the financial processes for various types of projects and the various pricing models available Ability to assess current processes, identify improvement areas and suggest technology solutions Knowledge of one or two industry domains Client interfacing skills Project and team management Technical and Professional: Primary skills: Technology-Big Data - Data Processing-Spark Preferred Skills: Technology-Big Data - Data Processing-Spark
Posted 4 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering Service Line: Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional: Primary skills: Technology-Big Data - Data Processing-Map Reduce Preferred Skills: Technology-Big Data - Data Processing-Map Reduce
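The MapReduce skill this posting centers on follows a fixed pattern: a map phase emits (key, value) pairs, a shuffle groups them by key, and a reduce phase combines each group. A minimal, stdlib-only word-count sketch of that pattern:

```python
# Classic MapReduce word count: map emits (word, 1) pairs,
# sort stands in for the shuffle, groupby + sum is the reduce.
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    shuffled = sorted(pairs, key=itemgetter(0))  # shuffle/sort by key
    return {
        key: sum(count for _, count in group)
        for key, group in groupby(shuffled, key=itemgetter(0))
    }

counts = reduce_phase(map_phase(["big data big ideas", "data pipelines"]))
print(counts)  # {'big': 2, 'data': 2, 'ideas': 1, 'pipelines': 1}
```

In Hadoop the same two functions run distributed: mappers process input splits in parallel, the framework performs the shuffle across the network, and reducers each receive one sorted key range.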
Posted 4 weeks ago
2.0 - 7.0 years
5 - 9 Lacs
Pune
Work from Office
Educational: Bachelor of Engineering Service Line: Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Knowledge of more than one technology Basics of architecture and design fundamentals Knowledge of testing tools Knowledge of agile methodologies Understanding of project life cycle activities on development and maintenance projects Understanding of one or more estimation methodologies Knowledge of quality processes Basics of the business domain to understand the business requirements Analytical abilities, strong technical skills, good communication skills Good understanding of the technology and domain Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods Awareness of the latest technologies and trends Excellent problem-solving, analytical and debugging skills Technical and Professional: Primary skills: Hadoop, Hive, HDFS Preferred Skills: Technology-Big Data - Hadoop-Hadoop
Posted 4 weeks ago
5.0 - 9.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational: Bachelor of Engineering, BCA, BSc, MCA, MTech, MSc Service Line: Data & Analytics Unit Responsibilities:
1. 5-8 years of experience in Azure (hands-on experience in Azure Databricks and Azure Data Factory)
2. Good knowledge of SQL and PySpark
3. Knowledge of the Medallion architecture pattern
4. Knowledge of Integration Runtime
5. Knowledge of the different ways of scheduling jobs via ADF (event, schedule, etc.)
6. Knowledge of AAS and cubes
7. Ability to create, manage and optimize cube processing
8. Good communication skills
9. Experience leading a team
Additional Responsibilities: Good knowledge of software configuration management systems Strong business acumen, strategy and cross-industry thought leadership Awareness of the latest technologies and industry trends Logical thinking and problem-solving skills along with an ability to collaborate Knowledge of two or three industry domains Understanding of the financial processes for various types of projects and the various pricing models available Client interfacing skills Knowledge of SDLC and agile methodologies Project and team management Preferred Skills: Technology-Big Data - Data Processing-Spark
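The Medallion architecture this posting asks for layers data as bronze (raw, as ingested), silver (validated and typed), and gold (business-level aggregates). A stdlib-only sketch of that flow; in Azure Databricks each layer would be a Delta table and each step a PySpark job orchestrated by ADF, and the field names here are hypothetical.

```python
# Bronze layer: raw ingested records kept exactly as received.
bronze = [
    {"order_id": "1", "qty": "2", "price": "10.0"},
    {"order_id": "2", "qty": "bad", "price": "5.0"},
    {"order_id": "3", "qty": "1", "price": "7.5"},
]

def to_silver(rows):
    # Silver layer: validate and cast; skip rows that fail parsing
    # (in practice they would be routed to a quarantine table).
    clean = []
    for r in rows:
        try:
            clean.append({"order_id": r["order_id"],
                          "qty": int(r["qty"]),
                          "price": float(r["price"])})
        except ValueError:
            continue
    return clean

def to_gold(rows):
    # Gold layer: business aggregate served to AAS cubes / BI tools.
    return {"orders": len(rows),
            "revenue": sum(r["qty"] * r["price"] for r in rows)}

print(to_gold(to_silver(bronze)))  # {'orders': 2, 'revenue': 27.5}
```

Keeping bronze untouched is the key design choice: when validation rules change, silver and gold can be rebuilt from the raw layer without re-ingesting from source systems.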
Posted 4 weeks ago
5.0 - 7.0 years
5 - 5 Lacs
Kochi, Hyderabad, Thiruvananthapuram
Work from Office
Key Responsibilities
Develop & Deliver: Build applications/features/components as per design specifications, ensuring high-quality code adhering to coding standards and project timelines.
Testing & Debugging: Write, review, and execute unit test cases; debug code; validate results with users; and support defect analysis and mitigation.
Technical Decision Making: Select optimal technical solutions, including reuse or creation of components, to enhance efficiency, cost-effectiveness, and quality.
Documentation & Configuration: Create and review design documents, templates, checklists, and configuration management plans; ensure team compliance.
Domain Expertise: Understand the customer's business domain deeply to advise developers and identify opportunities for value addition; obtain relevant certifications.
Project & Release Management: Manage delivery of modules/user stories, estimate efforts, coordinate releases, and ensure adherence to engineering processes and timelines.
Team Leadership: Set goals (FAST), provide feedback, mentor team members, maintain motivation, and manage people-related issues effectively.
Customer Interaction: Clarify requirements, present design options, conduct demos, and build customer confidence through timely, quality deliverables.
Technology Stack: Expertise in Big Data technologies (PySpark, Scala), plus preferred skills in AWS services (EMR, S3, Glue, Airflow, RDS, DynamoDB), CI/CD tools (Jenkins), relational and NoSQL databases, microservices, and containerization (Docker, Kubernetes).
Soft Skills & Collaboration: Communicate clearly, work under pressure, handle dependencies and risks, collaborate with cross-functional teams, and proactively seek and offer help.
Required Skills: Big Data, PySpark, Scala
Additional Comments:
Must-Have Skills: Big Data (PySpark + Java/Scala)
Preferred Skills: AWS (EMR, S3, Glue, Airflow, RDS, DynamoDB, or similar) CI/CD (Jenkins or another) Relational database experience (any) NoSQL database experience (any) Microservices, domain services, API gateways, or similar Containers (Docker, K8s, or similar)
Posted 1 month ago