4.0 - 6.0 years
12 - 18 Lacs
Chennai, Bengaluru
Work from Office
Key Skills : Python, SQL, PySpark, Databricks, AWS, Data Pipeline, Data Integration, Airflow, Delta Lake, Redshift, S3, Data Security, Cloud Platforms, Life Sciences.

Roles & Responsibilities :
- Develop and maintain robust, scalable data pipelines for ingesting, transforming, and optimizing large datasets from diverse sources.
- Integrate multi-source data into performant, query-optimized formats such as Delta Lake, Redshift, and S3.
- Tune data processing jobs and storage layers to ensure cost efficiency and high throughput.
- Automate data workflows for ingestion, transformation, and reporting using orchestration tools such as Airflow and the Databricks APIs.
- Implement data validation and quality checks to ensure reliable and accurate data.
- Manage and optimize AWS and Databricks infrastructure to support scalable data operations.
- Lead cloud platform migrations and upgrades, transitioning legacy systems to modern, cloud-native solutions.
- Enforce security best practices such as IAM policies and data encryption, and ensure compliance with regulatory standards.
- Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to deliver data solutions.

Experience Requirement :
- 4-6 years of hands-on experience in data engineering with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong background in designing and building data pipelines and optimizing data storage and processing.
- Proficiency with cloud services such as AWS (S3, Redshift, Lambda) for building scalable data solutions.
- Hands-on experience with containerized environments and orchestration tools such as Airflow for automating data workflows.
- Expertise in data migration strategies and transitioning legacy data systems to modern cloud platforms.
- Experience with performance tuning, cost optimization, and lifecycle management of cloud data solutions.
- Familiarity with regulatory compliance (GDPR, HIPAA) and security practices (IAM, encryption).
- Experience in the Life Sciences or Pharma domain is highly preferred, with an understanding of industry-specific data requirements.
- Strong problem-solving abilities with a focus on delivering high-quality data solutions that meet business needs.

Education : Any Graduation.
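The posting above asks for implementing data validation and quality checks in data pipelines. A minimal sketch of what such a check might look like in plain Python; the field names and rules here are illustrative assumptions, not taken from the posting:

```python
# Illustrative data-quality check: validate records before loading them
# downstream. Field names and validation rules are hypothetical examples.

REQUIRED_FIELDS = {"id", "timestamp", "amount"}

def validate_record(record: dict) -> list:
    """Return a list of human-readable quality issues (empty list = clean)."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append("missing fields: %s" % sorted(missing))
    if "amount" in record and (record["amount"] is None or record["amount"] < 0):
        issues.append("amount must be a non-negative number")
    return issues

def split_valid_invalid(records: list) -> tuple:
    """Partition a batch into clean records and records needing quarantine."""
    valid = [r for r in records if not validate_record(r)]
    invalid = [r for r in records if validate_record(r)]
    return valid, invalid
```

In a real pipeline the quarantined records would typically be written to a dead-letter location for inspection rather than silently dropped.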
Posted 4 weeks ago
6.0 - 9.0 years
18 - 25 Lacs
Chennai
Work from Office
Key Skills : Python, SQL, PySpark, Databricks, AWS, Data Pipeline, Data Governance, Data Security, Leadership, Cloud Platforms, Life Sciences, Migration, Airflow.

Roles & Responsibilities :
- Lead a team of data engineers and developers, defining technical strategy, best practices, and architecture for data platforms.
- Architect, develop, and manage scalable, secure, and high-performing data solutions on AWS and Databricks.
- Oversee the design and development of robust data pipelines for ingestion, transformation, and storage of large-scale datasets.
- Enforce data validation, lineage, and quality checks across the data lifecycle, defining standards for metadata, cataloging, and governance.
- Design automated workflows using Airflow, Databricks Jobs/APIs, and other orchestration tools for end-to-end data operations.
- Implement performance tuning strategies, cost optimization best practices, and efficient cluster configurations on AWS/Databricks.
- Define and enforce data security standards and IAM policies, and ensure compliance with industry-specific regulatory frameworks.
- Work closely with business users, analysts, and data scientists to translate requirements into scalable technical solutions.
- Drive strategic data migrations from on-prem/legacy systems to cloud-native platforms with minimal risk and downtime.
- Mentor junior engineers, contribute to talent development, and ensure continuous learning within the team.

Experience Requirement :
- 6-9 years of hands-on experience in data engineering with expertise in Python, SQL, PySpark, Databricks, and AWS.
- Strong leadership experience in data engineering or data architecture roles, with a proven track record of leading teams and delivering large-scale data solutions.
- Expertise in designing and developing data pipelines, optimizing performance, and ensuring data quality.
- Solid experience with cloud platforms (AWS, Databricks), data governance, and security best practices.
- Experience in data migration strategies and leading transitions from on-premises to cloud-based environments.
- Experience in the Life Sciences or Pharma domain is highly preferred, with a deep understanding of industry-specific data requirements.
- Strong communication and interpersonal skills with the ability to collaborate across teams and engage stakeholders.

Education : Any Graduation.
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
Kolkata, West Bengal
On-site
You are an experienced SAP BASIS professional responsible for managing and optimizing SAP ECC 6.0 and S/4HANA environments on a Private Cloud infrastructure. Your role involves system administration, performance tuning, upgrades, and ensuring high availability of SAP applications. You must have expertise in SAP BASIS administration and the supporting cloud technologies.

As the SAP BASIS Administrator, your key responsibilities include:
- Installing, configuring, and maintaining SAP ECC 6.0 and related SAP landscapes in a Private Cloud environment.
- Performing system upgrades, migrations, and patching activities.
- Managing system refreshes, client copies, and database administration.
- Performing transport management and landscape synchronization across DEV, QA, and PROD environments.

Additionally, you will be involved in Cloud & Infrastructure Management by:
- Managing SAP on Private Cloud platforms in the DC environment and DR setup.
- Monitoring and optimizing cloud resources for cost efficiency and performance.
- Maintaining backup, disaster recovery, and high-availability solutions for SAP workloads.
- Collaborating with cloud service providers to troubleshoot and resolve infrastructure-related issues.

You will also focus on Performance Tuning & Security by:
- Conducting system health checks, tuning, and optimization for SAP applications and databases.
- Monitoring SAP system logs, work processes, and background jobs to ensure system stability.
- Implementing and enforcing SAP security best practices, including user management, role assignments, and authorization controls.
- Performing vulnerability assessments and compliance checks to align with ITGC audits and other industry regulations.

If you are qualified and interested in this role, please forward your updated CV to ssi.management@gmail.com.
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
The IWPT Data Platform team at Morningstar is looking for a seasoned Senior Software Engineer to take charge of developing and maintaining Datafeed applications using Node.js and ASP.NET. These applications play a crucial role in delivering precise and timely data to clients, ensuring the efficiency of their financial systems. As a Senior Software Engineer, your responsibilities will include overseeing the complete development, reliability, and performance of our Datafeed applications. You will lead projects, guide junior engineers, and work closely with clients and internal teams to enhance functionality, troubleshoot issues, and drive continuous improvements.

Your key responsibilities will include:
- Leading the development and upkeep of Datafeed applications using JavaScript (Node.js) or C# (ASP.NET).
- Collaborating with clients and internal teams to resolve complex issues related to Datafeed processing, data accuracy, and application functionality.
- Designing and implementing enhancements to boost the reliability, performance, and usability of applications.
- Mentoring junior engineers to enhance their development skills.
- Documenting development processes and resolutions to promote knowledge sharing and continuous improvement.

To be successful in this role, you should possess:
- A minimum of 5 years of experience in software development or a related field.
- Advanced proficiency in Node.js or ASP.NET.
- A strong understanding of application development and architecture.
- Extensive knowledge of data processing and manipulation techniques.
- Familiarity with relational and non-relational databases and queries.
- Expertise in troubleshooting and debugging techniques.
- Excellent communication skills, and strong problem-solving and analytical skills.
- The ability to work independently and lead a team.
- A customer-focused mindset with a commitment to delivering high-quality solutions.
- A bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in developing and supporting Datafeed applications or similar financial systems (a plus).

Preferred skills include experience with cloud platforms such as AWS or Azure, knowledge of financial markets and Datafeeds, and experience with DevOps practices and tools.

Morningstar is an equal opportunity employer offering a hybrid work environment that allows for remote work and regular in-person collaboration, typically three days each week, although some positions are fully remote. Various benefits are also available to facilitate flexibility as needs evolve, ensuring meaningful engagement with global colleagues.

Upon acceptance of an offer, disclosure of personal and related investments to the Compliance team is required for review to ensure compliance with Code of Ethics requirements. Any identified conflicts of interest will necessitate the immediate liquidation of holdings. Additionally, depending on your department and location of work, certain employee accounts must be held with an approved broker.
Posted 4 weeks ago
2.0 - 6.0 years
0 Lacs
Karnataka
On-site
As the Senior Analyst, Business Intelligence at LSEG, you will play a crucial role in driving data-informed decision-making and continuous improvement within Colleague Services. Your primary focus will be on designing, maintaining, and enhancing dashboards that visualize key performance indicators, utilizing data from ServiceNow and other platforms to generate actionable insights. Your expertise in data analysis and reporting will be instrumental in identifying opportunities for improvement across corporate functions and providing valuable recommendations to enhance efficiency.

Your responsibilities will include:
- Designing and maintaining Power BI dashboards to deliver critical insights into Colleague Services performance metrics such as ticket volumes, MTTR, and CSAT scores.
- Analyzing trends in ticket volumes and performance data to provide actionable insights to leadership, and collaborating with key partners to align on performance improvement strategies.
- Developing a dashboard for the Digital Workplace to monitor content management and user journeys, ensuring ongoing updates based on evolving business needs.
- Ensuring compliance with the organization's Enterprise Risk Management Framework and corporate governance standards.
- Leading and supporting ServiceNow development projects to meet timelines and deliverables.

Continuous learning and development will be encouraged as you stay up to date with industry trends, new technologies, and analytics methodologies relevant to the role.

The ideal candidate for this position will have a Bachelor's degree in a related field such as Business Analytics or Data Science, along with 2-3 years of experience in BI development. Strong project management skills, effective communication abilities, and experience with ESM systems like ServiceNow are desirable. Proficiency in data visualization, data warehousing, and SQL programming will be advantageous, as will familiarity with cloud platforms such as AWS and Azure. Your eagerness to learn, adapt, and collaborate within cross-functional teams will be essential to meeting evolving business needs.

Joining LSEG means being part of a dynamic organization with a global presence across 65 countries. You will have the opportunity to contribute to driving financial stability, empowering economies, and supporting sustainable growth. The values of Integrity, Partnership, Excellence, and Change will guide your decision-making and actions, fostering a collaborative and creative culture where new ideas are encouraged. LSEG offers a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives, while valuing individuality and diversity in the workforce.
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
As an AI Decision Science Consultant at Accenture Strategy & Consulting, specifically in the Global Network - Data & AI practice under the CMT Software & Platforms team, you will play a pivotal role in helping clients leverage analytics to achieve high performance and make better decisions. In collaboration with onsite counterparts, you will drive the development and delivery of Data & AI solutions for SaaS and PaaS clients. You will have the opportunity to work with a diverse team of talented professionals experienced in leading statistical tools and methods. From gathering business requirements to developing and testing AI algorithms tailored to address specific business challenges, you will be involved in the end-to-end process of delivering AI solutions. Your role will also include monitoring project progress, managing risks, and fostering positive client relationships by ensuring alignment between project deliverables and client expectations. In this role, you will mentor and guide a team of AI professionals, promoting a culture of collaboration, innovation, and excellence. Continuous learning and professional development will be supported by Accenture, enabling you to enhance your skills and certifications in SaaS & PaaS technologies. As part of the Data & AI practice, you will be at the forefront of leveraging AI technologies such as Generative AI frameworks and statistical models to drive business performance improvement initiatives. To excel in this role, you should possess a bachelor's or master's degree in computer science, engineering, data science, or a related field. With at least 5 years of experience in working on AI projects, you should have hands-on exposure to AI technologies, statistical packages, and machine learning techniques. Proficiency in programming languages such as R, Python, Java, SQL, and experience in working with cloud platforms like AWS, Azure, or Google Cloud will be crucial. 
Your strong analytical, problem-solving, and project management skills will be essential in navigating complex issues and delivering successful outcomes. Excellent communication and interpersonal skills will enable you to engage effectively with clients and internal stakeholders, while your ability to work with large datasets and present insights will drive informed decision-making. Join us at Accenture Strategy & Consulting and be part of a dynamic team dedicated to helping clients unlock new business opportunities through Data & AI solutions.
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
Haryana
On-site
About the Role: We are seeking an IT Solution Consultant with expertise in insurance and reinsurance technology to act as the main point of contact for clients, providing consultation, implementation, and support for digital insurance solutions. The role requires a mix of insurance domain expertise, technology implementation skills, and client engagement experience.

Key Responsibilities:
- Act as the primary POC for clients, addressing all technology-related queries for insurance solutions.
- Lead digital transformation projects, including API integrations, SaaS platforms, and process automation.
- Work closely with underwriters, claims teams, brokers, and product managers to optimize workflows.
- Drive pre-sales, product demonstrations, and client solutioning for InsurTech products.
- Collaborate with internal teams to develop customized insurance technology solutions.
- Ensure compliance with insurance and reinsurance regulatory standards (Lloyd's, Solvency II, IFRS 17, etc.).

Required Skills & Experience:
- 5+ years of experience in insurance/reinsurance technology consulting.
- Expertise in policy administration, underwriting, and claims automation.
- Strong understanding of API integrations, cloud platforms (AWS, Azure), and SaaS-based InsurTech solutions.
- Experience with Guidewire, Duck Creek, SICS, TCS BaNCS, or similar platforms.
- Hands-on with SQL, Power BI, Looker, or other BI tools for data analysis.
- Excellent communication and stakeholder management skills.

Nice to Have:
- Experience in AI/ML-based insurance automation.
- Knowledge of RPA tools for process optimization.
- Exposure to enterprise architecture and system integration.
Posted 4 weeks ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
Job Overview
Kale Logistics, a leading SaaS software product firm, is seeking an experienced and results-oriented Technology Architect specializing in technical architecture to lead our technical engineering practice. This pivotal role involves overseeing and optimizing the technical architecture of our software products, driving innovation, and ensuring exceptional user experiences. As a key leader in our organization, the Technology Architect will play a crucial role in enhancing the scalability, reliability, and efficiency of our solutions.

Key Responsibilities
- Technical Engineering Leadership: Lead and manage the technical engineering team to ensure high-quality delivery and continuous improvement in product architecture.
- Technical Architecture Design: Design and implement scalable, high-performing technical architectures to support SaaS products and services.
- Optimization and Tuning: Collaborate with development teams to optimize code, algorithms, and infrastructure for maximum performance and efficiency.
- Best Practices Implementation: Establish and enforce technical architecture best practices and standards across the organization.
- Troubleshooting and Issue Resolution: Investigate and resolve complex technical issues, providing technical expertise and guidance to teams.
- Mentorship and Training: Mentor junior engineers and provide training on technical architecture methodologies and tools.
- Collaboration: Work closely with cross-functional teams, including developers, testers, product managers, and operations teams, to ensure alignment with technical goals.
- Technical Monitoring and Reporting: Implement monitoring tools and processes to track system performance and generate reports for stakeholders.
- Innovation and Research: Stay updated with the latest trends and technologies in technical architecture and apply them to drive innovation.

Must-Have
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Extensive experience (8+ years) in technical architecture, including hands-on experience with architecture design, optimization, and implementation.
- Strong understanding of software architecture, design patterns, and distributed systems.
- Proficiency in programming technologies (.NET Core, Angular) and databases (e.g., SQL, NoSQL).
- Proven ability to lead and mentor teams, as well as manage complex projects.

Good-to-Have
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and containerization (e.g., Docker, Kubernetes).
- Familiarity with microservices architecture and modern development practices (e.g., CI/CD, DevOps).
- Certifications in technical architecture or related fields.
- Strong problem-solving skills and ability to work under pressure.
- Excellent communication and collaboration skills.

Why Join Kale Logistics
- Opportunity to lead and shape the technical engineering practice.
- Collaborative and innovative work environment.
- Competitive salary and benefits package.
- Opportunities for professional growth and development.

If you are a passionate and experienced Technology Architect specializing in technical architecture and are ready to lead and drive excellence at Kale Logistics, we would love to hear from you. Apply today to be part of our dynamic team and contribute to our mission of delivering world-class SaaS solutions. (ref:hirist.tech)
Posted 4 weeks ago
4.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Scala Developer at our company in Pune, you will be responsible for developing, testing, and maintaining scalable, robust, and high-performance applications using Scala. You will collaborate with product managers, architects, and other developers to define software requirements and deliverables. Your role will involve writing clean, maintainable, and efficient code following best practices, as well as integrating third-party APIs and services as needed. It will be your responsibility to optimize application performance and ensure scalability.

To excel in this role, you should have:
- Strong proficiency in the Scala programming language, with 4 to 10 years of experience.
- Familiarity with functional programming concepts and patterns.
- Proficiency in building RESTful APIs and web services.
- Knowledge of database technologies such as PostgreSQL, MongoDB, or Cassandra.
- Experience with distributed computing frameworks like Apache Spark and Kafka (beneficial).
- An understanding of software development principles, including Agile, TDD, and CI/CD.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud (advantageous).

If you are passionate about developing high-quality applications using Scala and possess the required skills and experience, we would like to hear from you. Join our team and contribute to building innovative solutions that drive our business forward.
Posted 4 weeks ago
1.0 - 5.0 years
0 Lacs
Vadodara, Gujarat
On-site
Job Title: Data Architect
Experience: 3 to 4 years
Location: Vadodara, Gujarat
Contact: 9845135287

Job Summary
We are seeking a highly skilled and experienced Data Architect to join our team. As a Data Architect, you will play a crucial role in assessing the current state of our data landscape and working closely with the Head of Data to develop a comprehensive data strategy that aligns with our organisational goals. Your primary responsibility will be to understand and map our current data environments, then help develop a detailed roadmap that will deliver a data estate enabling our business to deliver on its core objectives.

Main Duties & Responsibilities
The role's core duties include, but are not limited to:
- Assess the current state of our data infrastructure, including data sources, storage systems, and data processing pipelines.
- Collaborate with the Data Ops Director to define and refine the data strategy, taking into account business requirements, scalability, and performance.
- Design and develop a cloud-based data architecture, leveraging Azure technologies such as Azure Data Lake Storage, Azure Synapse Analytics, and Azure Data Factory.
- Define data integration and ingestion strategies to ensure smooth and efficient data flow from various sources into the data lake and warehouse.
- Develop data modelling and schema designs to support efficient data storage, retrieval, and analysis.
- Implement data governance processes and policies to ensure data quality, security, and compliance.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and provide architectural guidance.
- Conduct performance tuning and optimization of the data infrastructure to meet business and analytical needs.
- Stay updated with the latest trends and advancements in data management, cloud technologies, and industry best practices.
- Provide technical leadership and mentorship to junior team members.
Key Skills
- Proven work experience as a Data Architect or in a similar role, with a focus on designing and implementing cloud-based data solutions using Azure technology.
- Strong knowledge of data architecture principles, data modelling techniques, and database design concepts.
- Experience with cloud platforms, particularly Azure, and a solid understanding of their data-related services and tools.
- Proficiency in SQL and one or more programming languages commonly used for data processing and analysis (e.g., Python, R, Scala).
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks.
- Knowledge of data governance, data security, and compliance practices.
- Strong analytical and problem-solving skills, with the ability to translate business requirements into scalable and efficient data solutions.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously.
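The Key Skills above mention ETL/ELT processes and data pipeline frameworks. As a minimal illustration of the extract-transform-load pattern in plain Python; the source data, field names, and transformation rules are hypothetical, not from the posting:

```python
# Minimal extract-transform-load sketch in pure Python.
# Field names ("cust", "amt") and rules are illustrative assumptions.

def extract(rows: list) -> list:
    """Pull raw records from a source (here, an in-memory list)."""
    return list(rows)

def transform(rows: list) -> list:
    """Normalise field names and types, dropping rows with no amount."""
    out = []
    for r in rows:
        if r.get("amt") is None:
            continue
        out.append({"customer": r["cust"].strip().lower(),
                    "amount": float(r["amt"])})
    return out

def load(rows: list, target: list) -> int:
    """Append transformed rows to a target store; return the count loaded."""
    target.extend(rows)
    return len(rows)

def run_pipeline(source: list, target: list) -> int:
    """Wire the three stages together."""
    return load(transform(extract(source)), target)
```

In an ELT variant the raw rows would be loaded first and transformed inside the warehouse; the stage boundaries stay the same.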
Working Relationships
- Liaison with stakeholders at all levels of the organisation

Communication
- Communicate with leadership and colleagues in relation to all business activities
- Highly articulate and able to explain complex concepts in bite-size chunks
- Strong ability to provide clear written reporting and analysis

Personal Qualities
- Ability to work to deadlines, with good time management skills
- Commercially mindful and able to deliver solutions that maximise value
- Strong analytical skills
- Accurate, with excellent attention to detail
- Personal strength and resilience
- Adaptable and embraces change
- Reliable, conscientious, and hardworking
- Approachable and professional
- Shows willingness to learn, but recognises the limits of their ability and when to seek advice

Knowledge / Key Skills (Essential / Desirable)
- Experience of Azure development and design principles
- Enterprise-level data warehousing design and implementation
- Architecture principles
- Proficiency in SQL development
- Familiarity with data integration techniques, ETL/ELT processes, and data pipeline frameworks
- Knowledge of data governance, data security, and compliance practices
- Strong experience mapping an existing data landscape and developing a roadmap to deliver business requirements
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders
- Ability to adapt to a fast-paced and dynamic work environment and manage multiple priorities simultaneously
- Knowledge of Enterprise Architecture frameworks (e.g., TOGAF)
- Programming languages such as R, Python, Scala, etc.

Job Type: Full-time
Experience: total work: 1 year (Preferred)
Work Location: In person
Posted 4 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You should have experience in understanding and translating data and analytics requirements and functional needs into technical requirements while collaborating with global customers. Your responsibilities will include:
- Designing cloud-native data architectures to support scalable, real-time, and batch processing.
- Building and maintaining data pipelines for large-scale data management, in alignment with the data strategy and processing standards.
- Defining strategies for data modeling, data integration, and metadata management.
- Strong experience in database, data warehouse, and data lake design and architecture.
- Proficiency in leveraging cloud platforms such as AWS, Azure, or GCP for data storage, compute, and analytics services.
- Experience in database programming using various SQL flavors.
- Implementing data governance frameworks encompassing data quality, lineage, and cataloging.
- Collaborating with cross-functional teams, including business analysts, data engineers, and DevOps teams.
- Familiarity with the Big Data ecosystem, whether on-premises (Hortonworks/MapR) or in the cloud.
- Evaluating emerging cloud technologies and suggesting enhancements to the data architecture.
- Proficiency with an orchestration tool such as Airflow or Oozie for scheduling pipelines.
- Hands-on experience with tools such as Spark Streaming, Kafka, Databricks, and Snowflake.
- Comfort working in an Agile/Scrum development process and optimizing data systems for cost efficiency, performance, and scalability.
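The posting above asks for familiarity with orchestration tools such as Airflow or Oozie for scheduling pipelines. At their core, such tools execute tasks in dependency order over a DAG; a minimal sketch of that idea in plain Python, with made-up task names for illustration:

```python
# Toy dependency-ordered scheduler: the core idea behind pipeline
# orchestrators such as Airflow. Task names here are illustrative.
from graphlib import TopologicalSorter

def run_order(dag: dict) -> list:
    """Return one valid execution order for a task dependency graph.

    `dag` maps each task to the set of tasks it depends on.
    Raises graphlib.CycleError if the graph contains a cycle.
    """
    return list(TopologicalSorter(dag).static_order())

# Example pipeline: ingest -> validate -> transform -> load -> report,
# where the report also reads validation results directly.
pipeline = {
    "validate": {"ingest"},
    "transform": {"validate"},
    "load": {"transform"},
    "report": {"load", "validate"},
}
```

Real orchestrators add scheduling, retries, and parallel execution of independent tasks on top of exactly this ordering guarantee.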
Posted 4 weeks ago
15.0 - 19.0 years
0 Lacs
Maharashtra
On-site
At PwC, we focus on leveraging advanced technologies and techniques in data and analytics engineering to design and develop robust data solutions for our clients. As a Director - Generative AI with over 15 years of experience, you will play a crucial role in transforming raw data into actionable insights, enabling informed decision-making, and driving business growth. Your main responsibilities will involve developing and implementing advanced AI and ML solutions to drive innovation and enhance business processes.

You should possess:
- Proficiency in Generative AI-based application development, with a strong focus on leveraging AI and ML technologies.
- Strong experience in Python programming and related frameworks such as Django and Flask.
- Extensive experience in building scalable and robust applications using Python.
- A solid understanding of data engineering principles and technologies, including ETL, data pipelines, and data warehousing.
- Familiarity with AI and ML concepts, algorithms, and libraries such as TensorFlow and PyTorch.
- Knowledge of cloud platforms like AWS, Azure, and GCP and their AI/ML services.
- Experience with database systems (SQL, NoSQL) and data modeling (a plus).
- Strong problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Excellent leadership and team-management skills, with the ability to motivate and develop a high-performing team.
- Strong communication and collaboration skills for working effectively in cross-functional teams.
- A self-motivated, proactive attitude with a passion for learning and staying up to date with emerging technologies and industry trends.

An educational background in BE / B.Tech / MCA / M.Sc / M.E / M.Tech / MBA is preferred for this position.
Posted 4 weeks ago
2.0 - 6.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
Dimensionless Technologies is a AI product company that offers AI-based solutions to a diverse range of industries. Founded in 2016 in Mumbai, India, our journey has been marked by a relentless pursuit of excellence and a commitment to innovation. We are looking for a Smart, Intelligent and Hard working software Test Engineer. Responsibilities: Design, develop, and execute test plans and test cases based on software requirements. Perform manual testing to ensure software functionality and usability. Conduct performance testing to evaluate system responsiveness and stability under various conditions. Develop and maintain automation test scripts using Python to streamline testing processes. Utilize test management tools (e.g., JIRA, TestRail, or HP ALM) to organize, track, and report testing activities and results. Collaborate with developers and cross-functional teams in Agile environments, actively participating in daily stand-ups, sprint planning, and retrospectives. Identify, document, and report bugs, errors, and inconsistencies in software. Conduct testing in cloud-based environments, ensuring compatibility and performance on cloud platforms. Continuously evaluate and implement new testing tools, methodologies, and cloud-based solutions. Required Skills: Strong knowledge of manual testing techniques and tools. Expertise in performance testing tools like JMeter or LoadRunner. Proficiency in automation testing, preferred Python (e.g., Selenium, Pytest). Familiarity with test management tools like Azure board, JIRA, TestRail, or HP ALM for efficient test lifecycle management. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud for testing and deployments. Familiarity with Agile methodologies and software development life cycles (SDLC). Strong problem-solving and analytical abilities. Excellent communication and documentation skills. Preferred Qualifications: Bachelors degree in Computer Science, Engineering, or a related field. 
Experience with API testing and tools like Postman or REST Assured. Understanding of CI/CD pipelines and version control systems like Git. Knowledge of database testing and SQL queries.
Posted 4 weeks ago
7.0 - 11.0 years
0 Lacs
Jaipur, Rajasthan
On-site
Job Summary: The Enterprise Full Stack Presales Engineering Manager is responsible for leading presales efforts, crafting strategic solutions, and driving sales growth by aligning client needs with tailored software solutions. This role requires a deep technical understanding of software development, cloud technologies, and enterprise architecture, along with strong business acumen and excellent communication skills. The ideal candidate will work closely with sales, product, and engineering teams to design and present custom solutions that address complex business challenges. Key Responsibilities: Lead presales engagements by understanding client requirements and architecting tailored software solutions. Develop and deliver compelling presentations, demonstrations, and Proof of Concepts (POCs) to showcase the value of custom solutions. Collaborate with sales teams to drive customer adoption and revenue growth through strategic technology initiatives. Provide technical expertise in full-stack development, cloud platforms, and enterprise application design to support sales efforts. Work closely with product and engineering teams to influence product roadmaps based on market needs and customer feedback. Respond to RFPs/RFIs with well-structured proposals and technical solutions. Conduct competitive analysis to position custom software offerings effectively in the market. Ensure seamless integration of custom solutions with clients' existing systems through API development and deployment best practices. Technical Skills: Full-Stack Development: Expertise in front-end and back-end technologies, including JavaScript (React, Angular, or Vue), Node.js, Java, or .NET. Cloud Platforms: Experience with AWS, Azure, or Google Cloud for deploying enterprise solutions. Database Management: Knowledge of SQL, PostgreSQL, MySQL, and NoSQL databases. Solution Architecture: Ability to design scalable and secure custom software solutions.
API Integration: Strong understanding of REST APIs, GraphQL, SDKs, and enterprise system integration. DevOps & CI/CD: Experience with containerization (Docker, Kubernetes) and deployment pipelines. Business & Soft Skills: Presales & Proposal Writing: Expertise in crafting proposals, responding to RFPs/RFIs, and developing POCs. Client Engagement: Ability to translate business pain points into tailored software solutions. Presentation & Storytelling: Strong communication skills to present technical concepts to both technical and non-technical stakeholders. Industry Awareness: Knowledge of custom software solutions across industries such as finance, healthcare, retail, and manufacturing. Competitive Analysis: Understanding of technology market trends and competitor positioning. Collaboration & Teamwork: Ability to work closely with sales, product, and engineering teams to ensure successful client engagements. Preferred Qualifications: Bachelor's/Master's degree in Computer Science, Software Engineering, or a related field. 7+ years of experience in software development, solution architecture, or presales engineering roles. Proven track record in leading presales initiatives and driving business growth. Strong problem-solving skills and ability to handle complex software-driven sales cycles. Experience working with enterprise clients and large-scale software implementations.
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
Bhopal, Madhya Pradesh
On-site
As a Senior OutSystems Developer at our company, located in Bhopal, you will be an integral part of our team, bringing 4 to 7 years of experience. Your primary responsibility will be to design, develop, and deploy scalable applications using the OutSystems low-code platform. You will work closely with business analysts, UX designers, and stakeholders to understand requirements and deliver technical solutions that meet the performance, security, and scalability needs of the applications. Your key responsibilities will include developing reusable components, libraries, and APIs, integrating OutSystems applications with third-party systems, databases, and APIs, and ensuring high code quality through code reviews, troubleshooting, and debugging. Additionally, you will mentor junior developers, contribute to team knowledge sharing, and stay updated with OutSystems best practices, emerging trends, and platform updates. To excel in this role, you should possess a strong understanding of OutSystems architecture, UI design, and data modeling. Experience in developing web and mobile applications using OutSystems, knowledge of integrations such as REST/SOAP APIs, databases, and third-party services, and familiarity with Agile methodologies, DevOps, and CI/CD pipelines are essential. Certification in OutSystems development (Professional or Expert Developer) would be a plus. Preferred skills include hands-on experience with OutSystems Service Studio, Integration Studio, and LifeTime, knowledge of JavaScript, CSS, HTML, and Bootstrap for UI customization, and exposure to cloud platforms (Azure, AWS, GCP) and DevOps tools. Strong problem-solving skills and the ability to thrive in a fast-paced environment are highly valued. Joining our team offers you a competitive salary and benefits, the opportunity to work on cutting-edge OutSystems projects, a collaborative and innovative work culture, as well as career growth and training opportunities.
If you are passionate about OutSystems development and eager to contribute to impactful projects, we welcome you to apply for this exciting opportunity.
Posted 4 weeks ago
10.0 - 15.0 years
20 - 35 Lacs
Hyderabad
Work from Office
Description: Hiring AWS DevOps Lead for the Hyderabad location Requirements: DevOps Lead We are looking for a visionary and hands-on DevOps Lead to drive the strategic direction, implementation, and continuous improvement of our DevOps practices across the organization. This individual will be responsible for leading high-performing DevOps teams, developing scalable CI/CD pipelines, enhancing cloud infrastructure reliability, and ensuring secure and efficient software delivery. The ideal candidate is a technical leader with deep expertise in automation, cloud operations, configuration management, and infrastructure-as-code (IaC). This role requires strong collaboration across engineering, security, product, and QA to enable a culture of continuous delivery, operational excellence, and system reliability. Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related technical discipline. 12+ years of overall experience in infrastructure engineering, DevOps, systems administration, or platform engineering. At least 3–5 years in a leadership role managing DevOps or infrastructure engineering teams. Hands-on expertise in cloud platforms (AWS, Azure, or GCP), with deep knowledge of networking, IAM, VPCs, storage, and compute services. Strong proficiency in Infrastructure as Code (IaC) using Terraform, Ansible, or equivalent. Experience building and managing CI/CD pipelines using tools such as Jenkins, GitLab CI, CircleCI, or ArgoCD. Strong background in Linux/Unix systems, system administration, scripting (e.g., Bash, Python, Go), and configuration management. Experience implementing containerization and orchestration using Docker, Kubernetes, Helm. Familiarity with observability tools and logging frameworks (e.g., ELK, Datadog, Fluentd, Prometheus, Grafana). Solid understanding of DevOps principles, Agile/Lean methodologies, and modern SDLC practices. 
Job Responsibilities: Lead the DevOps function, establishing best practices in CI/CD, infrastructure automation, cloud scalability, and operational excellence. Design and manage robust CI/CD pipelines to support reliable, repeatable, and secure code releases across multiple environments. Architect and maintain high-availability, scalable, and secure infrastructure across cloud platforms (AWS, Azure, GCP) using Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Pulumi. Champion DevSecOps principles, integrating security checks, compliance gates, and vulnerability scans into build and deployment processes. Drive adoption of containerization and orchestration tools such as Docker, Kubernetes, Helm, and service meshes like Istio or Linkerd. Lead infrastructure monitoring, alerting, and logging initiatives using tools like Prometheus, Grafana, ELK/EFK stack, Datadog, or New Relic. Collaborate with development, QA, and product teams to implement GitOps, blue/green deployments, canary releases, and rollback strategies. Manage cloud costs, optimize resource utilization, and oversee infrastructure capacity planning and scaling strategies. Own incident response, root cause analysis, and the design of self-healing and resilient infrastructure systems. Mentor and manage a team of DevOps engineers and SREs, facilitating technical growth, performance management, and team collaboration. Develop and maintain documentation, runbooks, and knowledge bases for DevOps processes and tooling. Keep current with industry trends and emerging technologies to recommend and implement forward-looking DevOps capabilities. What We Offer: Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment — or even abroad in one of our global centers or client facilities! Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays. Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings. Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses. Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!
Posted 4 weeks ago
2.0 - 4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Inviting applications for the role of Senior Principal Consultant, Tech Engineer. A tech engineer is responsible for designing, developing, and maintaining technology solutions that meet organizational needs. They play a crucial role in ensuring systems are efficient, reliable, and scalable to support business operations. Responsibilities: Design and develop technology solutions: Create and implement software and hardware solutions tailored to business requirements. Troubleshoot and resolve issues: Identify and fix technical problems to ensure smooth operation of systems and applications. Collaborate with teams: Work closely with cross-functional teams, including IT, product development, and management, to align technology solutions with business goals. Optimize system performance: Continuously monitor and improve system performance for efficiency and scalability. Ensure security compliance: Implement security measures to protect data and systems from unauthorized access. Experience in GenAI projects. Qualifications we seek in you! Minimum Qualifications / Skills: Education: Bachelor's degree in computer science, engineering, or a related field. Experience: Minimum of two years of experience in technology engineering or a similar role. Skills: Proficiency in programming languages like Java or Python, strong problem-solving skills, and familiarity with cloud platforms.
Posted 4 weeks ago
5.0 - 10.0 years
15 - 18 Lacs
Ahmedabad
Work from Office
- Lead cross-functional teams to deliver IT projects on time and within budget. - Manage sprints, timelines, and risks. - Collaborate with clients and stakeholders. - Ensure project scope, technical quality, and documentation standards.
Posted 4 weeks ago
5.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
About the Role: As an SRE (Big Data) Engineer (5 to 7 years) at PhonePe, you will be responsible for ensuring the stability, scalability, and performance of distributed systems operating at scale. You will collaborate with development, infrastructure, and data teams to automate operations, reduce manual effort, handle incidents, and continuously improve system reliability. This role requires strong problem-solving skills, operational ownership, and a proactive approach to mentoring and driving engineering excellence. Roles and Responsibilities: Ensure the ongoing stability, scalability, and performance of PhonePe's Hadoop ecosystem and associated services. Manage and administer Hadoop infrastructure including HDFS, HBase, Hive, Pig, Airflow, YARN, Ranger, Kafka, Pinot, and Druid. Automate BAU operations through scripting and tool development. Perform capacity planning, system tuning, and performance optimization. Set up, configure, and manage Nginx in high-traffic environments. Administer and troubleshoot Linux and Big Data systems, including networking (IP, iptables, IPsec). Handle on-call responsibilities, investigate incidents, perform root cause analysis, and implement mitigation strategies. Collaborate with infrastructure, network, database, and BI teams to ensure data availability and quality. Apply system updates and patches, and manage version upgrades in coordination with security teams. Build tools and services to improve observability, debuggability, and supportability. Participate in Kerberos and LDAP administration. Experience in capacity planning and performance tuning of Hadoop clusters. Work with configuration management and deployment tools like Puppet, Chef, Salt, or Ansible. Skills Required: Minimum 1 year of Linux/Unix system administration experience. Over 4 years of hands-on experience in Hadoop administration. Minimum 1 year of experience managing infrastructure on public cloud platforms like AWS, Azure, or GCP (optional).
Strong understanding of networking, open-source tools, and IT operations. Proficient in scripting and programming (Perl, Golang, or Python). Hands-on experience with maintaining and managing Hadoop ecosystem components like HDFS, YARN, HBase, and Kafka. Strong operational knowledge of systems (CPU, memory, storage, OS-level troubleshooting). Experience in administering and tuning relational and NoSQL databases. Experience in configuring and managing Nginx in production environments. Excellent communication and collaboration skills. Good to Have: Experience designing and maintaining Airflow DAGs to automate scalable and efficient workflows. Experience in ELK stack administration. Familiarity with monitoring tools like Grafana, Loki, Prometheus, and OpenTSDB. Exposure to security protocols and tools (Kerberos, LDAP). Familiarity with distributed systems like Elasticsearch or similar high-scale environments. PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles): Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance. Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System. Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program. Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy. Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment. Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy. Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog: Life at PhonePe, PhonePe in the news.
Posted 4 weeks ago
5.0 - 10.0 years
11 - 16 Lacs
Pune
Work from Office
Job Title: Engineer - Java Microservices, AS. Location: Pune, India. Role Description: As a Java Microservices engineer you would be responsible for designing, developing, and maintaining scalable microservices using Java and Spring Boot. You will collaborate with cross-functional teams to deliver features/enhancements on time, ensuring code quality and supporting the overall business requirements. What we'll offer you: 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Accident and Term Life Insurance. Your key responsibilities: Develop and maintain scalable and reliable microservices using Java, Spring Boot, and related technologies. Implement RESTful APIs and support integrations with other systems. Collaborate with various stakeholders (QA, DevOps, PO, and Architects) to ensure the business requirements are met. Participate in code reviews, troubleshooting, and mentoring junior members. Your skills and experience - Must Have: Overall experience of 5+ years with hands-on coding/engineering skills, extensively in Java technologies and microservices. Strong understanding of microservices architecture, patterns, and practices. Proficiency in Spring Boot, Spring Cloud, and development of REST APIs. Desirable skills that will help you excel: Prior experience working in an Agile/Scrum environment. Good understanding of containerization (Docker/Kubernetes), databases (SQL and NoSQL), and build tools (Maven/Gradle). Experience in development using Python. Knowledge of architecture and design principles, algorithms and data structures, and UI. Exposure to cloud platforms is a plus (preferably GCP). Knowledge of Kafka, RabbitMQ, etc., would be a plus. Strong problem-solving and communication skills.
Working knowledge of Git, Jenkins, CI/CD, Gradle, DevOps, and SRE techniques. Educational Qualifications: Bachelor's degree in Computer Science/Engineering or a relevant technology and science discipline. Technology certifications from any industry-leading cloud providers. How we'll support you - About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
Posted 4 weeks ago
9.0 - 12.0 years
15 - 25 Lacs
Pune
Work from Office
Hi, we are looking for a passionate IT Infrastructure Project Manager to be part of our team in the Pune office. Below is the job description: What does a successful IT Infrastructure Project Manager do at Fiserv: Drive project planning, scheduling, and risk management to ensure timely and within-budget delivery. Monitor project progress using industry-standard tools and report key performance indicators to stakeholders. Ensure compliance with governance, security, and regulatory standards throughout the project lifecycle. Act as a liaison between business stakeholders and technical teams to translate client requirements into actionable solutions. What will you do: Lead end-to-end project management for: client onboarding and transition initiatives, hardware refresh cycles and infrastructure upgrades, and infrastructure setup (including data center and cloud environments). Collaborate with vendors for procurement and deployment of network, server, and connectivity infrastructure. Prepare cost proposals, technical documentation, and project plans in coordination with business and technical teams. Manage onboarding of new clients onto the company's payment processing platform, ensuring infrastructure readiness and compliance. Coordinate cross-functional teams (network engineers, cloud architects, security teams, etc.) to ensure smooth project execution and delivery. What will you need to know: Proven experience in managing IT infrastructure projects, including data center operations, server deployments, and network architecture. Hands-on knowledge of cloud platforms. Understanding of ITIL processes, infrastructure lifecycle management, and vendor coordination. Excellent communication, stakeholder management, and documentation skills. Proficiency in project management tools (e.g., MS Project, JIRA, Confluence). What would be great to have: PMP or equivalent project management certification is a strong advantage. Experience in the fintech or payment processing industry is a plus.
We welcome and encourage diversity in our workforce. Fiserv is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Explore the possibilities of a career with Fiserv and Find your Forward with us!
Posted 4 weeks ago
3.0 - 7.0 years
14 - 18 Lacs
Bengaluru
Work from Office
Scale an existing RAG code base for a production-grade AI application. Proficiency in Prompt Engineering, LLMs, and Retrieval Augmented Generation. Programming languages like Python or Java. Experience with vector databases. Experience using LLMs in software applications, including prompting, calling, and processing outputs. Experience with AI frameworks such as LangChain. Troubleshooting skills and creativity in finding new ways to leverage LLMs. Experience with Azure. Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a GenAI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/PoCs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face. Understanding of the usage of libraries such as SciKit Learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms (e.g., Kubernetes, AWS, Azure, GCP) and related services is a plus. Experience and working knowledge in COBOL and Java would be preferred. Experience in code generation, code matching, and code translation. Prepare effort estimates, WBS, staffing plans, RACI, RAID, etc. Excellent interpersonal and communication skills. Engage with stakeholders for analysis and implementation. Commitment to continuous learning and staying updated with advancements in the field of AI. Demonstrate a growth mindset to understand clients' business processes and challenges. Preferred technical and professional experience - Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science, or a related field. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
Posted 4 weeks ago
2.0 - 5.0 years
14 - 17 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM, you'll play a vital role in the development and design of applications, providing regular support/guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include: Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements. Strive for continuous improvement by testing the built solution and working under an agile framework. Discover and implement the latest technology trends to maximize and build creative solutions. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience with Apache Spark (PySpark): In-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing. Big Data Technologies: Familiarity with Hadoop, HDFS, Kafka, and other big data tools. Data Engineering Skills: Strong understanding of ETL pipelines, data modelling, and data warehousing concepts. Strong proficiency in Python: Expertise in Python programming with a focus on data processing and manipulation. Data Processing Frameworks: Knowledge of data processing libraries such as Pandas and NumPy. SQL Proficiency: Experience writing optimized SQL queries for large-scale data analysis and transformation. Cloud Platforms: Experience working with cloud platforms like AWS, Azure, or GCP, including using cloud storage systems. Preferred technical and professional experience: Define, drive, and implement an architecture strategy and standards for end-to-end monitoring. Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering. Good to have: detection and prevention tools for company products and platform and customer-facing
Posted 4 weeks ago
2.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering. Service Line: Quality. Responsibilities: Data collection, profiling, EDA, and data preparation. AI model development, tuning, and validation. Visualization tools (e.g., Power BI, Tableau). Present findings to business users and project management teams. Propose ML-based solution approaches and estimates for new use cases. Contribute to AI-based modules in Infosys solutions development. Explore new advances in AI continuously and execute PoCs. Mentoring: Guide other team members. ML algorithms. AI domains: NLP, speech, computer vision. Supervised, unsupervised, and reinforcement learning. Tools for data analysis, AutoML, model deployment, and scaling. Knowledge of datasets. Programming: Python. Databases. Knowledge of cloud platforms: Azure/AWS/GCP. Preferred Skills: Technology-Machine Learning-AI/ML Solution Architecture and Design
Posted 4 weeks ago
8.0 - 11.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be Polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding. End-to-end contribution to technology-oriented development projects. Providing solutions with minimum system requirements and in Agile mode. Collaborate with Power Programmers, the Open Source community, and Tech User Groups. Custom development of new platforms, solutions, and opportunities. Work on large-scale digital platforms and marketplaces. Work on complex engineering projects using cloud-native architecture. Work with innovative Fortune 500 companies on cutting-edge technologies. Co-create and develop new products and platforms for our clients. Contribute to Open Source and continuously upskill in the latest technology areas. Incubate tech user groups. Technical and Professional: Primary skills: Technology-Cloud Platform-Cloud Platform - ALL, Technology-Cloud Platform-Google Cloud - Architecture, Technology-Container Platform-Docker, Technology-Container Platform-Kubernetes. Cloud Architecture & Design, Cloud Optimization & Automation, Innovation & Thought Leadership. Extensive experience with AWS, Azure, or GCP cloud platforms. Deep understanding of cloud computing concepts, including IaaS, PaaS, and SaaS. Strong experience with infrastructure as code (IaC) and DevOps practices. Experience with containerization and orchestration (Docker, Kubernetes). Strong knowledge of cloud security best practices and compliance standards. Industry certifications.
Preferred Skills: Technology-Cloud Platform-Cloud Platform - ALL Technology-Container Platform-Docker Technology-Container Platform-Kubernetes Technology-Cloud Platform-Google Cloud - Architecture
Posted 4 weeks ago