6.0 - 11.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Join Team Amex and let's lead the way together.

What we're looking for: You're a talented, creative, and motivated engineer who loves developing powerful, stable, and intuitive apps, and you're excited to work with a team of individuals who share that passion. You've accumulated years of experience, and you're excited about taking your mastery of Cloud, Big Data, and Java to a new level. You enjoy challenging projects involving big data sets and are cool under pressure. You're no stranger to fast-paced environments and agile development methodologies; in fact, you embrace them. With your strong analytical skills, your unwavering commitment to quality, your excellent technical skills, and your collaborative work ethic, you'll do great things here at American Express.

Purpose of the Role: As a Senior Engineer you will deliver technical solutions using cutting-edge technologies and industry best practices to implement the cloud platform. You should have experience in architecting, deploying, and automating cloud-based data platforms. You are well versed in technologies such as GCP, Big Data, Java, Scala, Spark, Kafka, APIs, Python, RDBMS, and NoSQL. You are experienced in building low-code/no-code data transformation tools using cloud technologies such as GCP, AWS, or Azure. As part of a fast-paced Agile team, you will design, develop, test, troubleshoot, and optimise solutions created to simplify access to Amex's Big Data Platform.

Responsibilities:
- As a Sr. Big Data Engineer, you'll be responsible for designing and building high-performance, scalable data platforms.
- Lead a team of enthusiastic and skilled engineers to drive product development and adoption.
- Collaborate effectively with product teams from the business group, understand the product roadmap and vision, and translate that into engineering artefacts.
- Work with a variety of teams and individuals, including platform engineers, use-case owners, and analytical users, to understand their needs and come up with innovative solutions.
- Follow the Amex way of building engineering products, driving engineering excellence by adopting DevOps principles.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field; a Master's degree is a plus.
- Great to have: GCP professional certification (Data Engineer / Cloud Architect) preferred.
- 6+ years of software development experience with hands-on coding expertise in Java, Python, Scala, etc.
- 3+ years of experience in creating low-code/no-code ETL tools for large-scale data transformation on GCP.
- Expert in Google BigQuery for data warehousing needs.
- Strong SQL and RDBMS skills; expert in writing complex SQL for databases such as Hive, MySQL, and Postgres, with proficiency in NoSQL databases as well.
- Experience working with Spark, Big Data, and Hive.
- Experience in Git management, including PR reviews and maintaining code hygiene.
- In-depth understanding of data warehousing concepts, dimensional modelling, and data integration techniques.
- Experience in optimising high-volume data processing jobs.
- 3+ years of experience in writing APIs and Spring Boot services.
- Knowledge of high availability and DR setup.
- Hands-on experience with CI/CD pipelines, automated test frameworks, DevOps, and source code management is a big plus (XLR, Jenkins, Git, Stash, Jira, Confluence, Splunk, etc.).
- Experience working in an Agile/SAFe framework for development.
- Excellent communication and analytical skills.
- Excellent team player with the ability to work with a global team.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Posted 3 weeks ago
6.0 - 8.0 years
10 - 14 Lacs
Noida
Work from Office
Job Profile Summary: In this role, you will design, develop, and provide support for the Point of Sale platform(s) that power the sales process for the client's life insurance distribution channels. This is a hands-on software engineering role. We are looking for an engineer who is passionate about solving business problems through innovative engineering practices. This role requires depth of knowledge and expertise that can be applied to all aspects of the software development lifecycle, as well as continuous partnership with multiple stakeholders to stay focused on common goals. As part of this dynamic role, you will work closely with business units and other IT teams to deliver leading-edge technology that enables digital capabilities.

Job Description: Deliver technical excellence.
- Contribute to the translation of business requirements into well-architected software solutions.
- Participate in, and at times facilitate, technical discussions with the team.
- Adhere and contribute to platform-related technical standards and processes.
- Develop data integration, migration, and deployment strategies.
- Deliver applications in accordance with relevant IT policies and procedures, adhering to the company's Software Development Life Cycle (SDLC).
- Identify and solve complex problems collaboratively.
- Contribute to the technical training and development of the team.

Who we are looking for:

Technical Skills:
- Strong background in design/development (functional and non-blocking programming) and support of large web-based systems, with complete software product lifecycle exposure.
- Strong software testing culture (unit testing, TDD, BDD).
- Experience with Agile development methodologies (Scrum, Kanban, XP/eXtreme Programming) and complexity estimation / planning poker.
- Strong understanding of environment management, release management, code versioning, engineering best practices, and deployment methodologies.

Must have experience with:
- Java development (Java 11+), RESTful APIs, and microservices development (Spring Boot, Spring Cloud)
- ReactJS v16.8+, application state management (Redux), ES6, and proven experience with advanced TypeScript concepts
- React Native for iOS and Android app development
- TDD/BDD (JUnit, Mockito, and Cucumber) and other testing frameworks such as Jest and Enzyme
- RDBMS and NoSQL
- Transpiler tools such as Babel and build tools such as Webpack
- Performance measuring tools such as profilers and performance optimization practices
- Component libraries such as Material-UI, Ant Design, etc.
- Code version control tools (Git, Bitbucket)
- DevOps (CI/CD, Docker, Kubernetes) and cloud platforms (AKS, API Gateway)
- Basics of event buses (Confluent Kafka)

Personal Traits:
- Excellent problem analysis skills; innovative and creative in developing solutions
- Strong verbal and written communication skills
- Strong emphasis on teamwork and collaboration to deliver business value
- Passionate about delivery of quality software
- Strong sense of drive, commitment, and personal accountability
- Works well in a dynamic environment

Education: Bachelor's in Computer Science, Computer Engineering, or equivalent/higher
Language: Fluent written and spoken English
EXPERIENCE: 6-8 Years
SKILLS: Primary Skill: CNA Development | Sub Skill(s): CNA Development | Additional Skill(s): Spring Boot Microservices, ReactJS, Core Java, Spring Boot, CSS, HTML, JavaScript Development, Unit Testing
Posted 3 weeks ago
5.0 - 6.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Overall Purpose: This position will manage a team of 10+ data engineers and interact on a consistent basis with other developers, architects, data product owners, and source systems. It requires multifaceted candidates with experience in data engineering, data analysis, and visualization, and good hands-on experience in PySpark, Databricks, and Azure Cloud Platform services.

Key Roles and Responsibilities:
- Lead and manage a team of data engineers to develop, understand, and enhance code in traditional data warehouse environments, data lakes, and cloud environments such as Snowflake, Azure, and Databricks.
- Build end-to-end business intelligence solutions, including data extraction, ETL processes that derive useful business insights, and dashboards that best represent this data.
- Write complex SQL queries used to transform data, using Python/Unix shell scripting.
- Understand business requirements and create visual reports and dashboards using Power BI or Tableau.
- Upskill to different technologies and understand existing products and programs in place.
- Work with other development and operations teams; be flexible with shifts and occasional weekend support.

Key Competencies:
- Full life-cycle experience on enterprise software development projects.
- Experience in relational databases/data marts/data warehouses and complex SQL programming.
- Extensive experience in ETL, shell or Python scripting, data modelling, analysis, and preparation.
- Experience with Unix/Linux systems, file systems, and shell scripting.
- Extensive knowledge of Python, PySpark, Databricks, and Azure Cloud Platform services.
- Good to have: experience with BI reporting tools (Power BI or Tableau).
- Good problem-solving and analytical skills used to resolve technical problems.
- A good understanding of business requirements and IT strategies.
- Experience in presentation design, development, and delivery, with good communication skills to present analytical results and recommendations for action-oriented, data-driven decisions and the associated operational and financial impacts.
- Experience in managing a team of 10 or more, including involvement in the appraisal and rating process.

Required/Desired Skills:
- Cloud platforms - Azure, Databricks, Delta Lake (Required, 5-6 years)
- SQL programming, Python/PySpark ETL (Required, 8+ years)
- Unix/Linux shell scripting (Required, 6-8 years)
- RDBMS and data warehousing (Required, 8+ years)
- Iceberg enablement (Desired, 2-3 years)
- Managed a team for 2+ years
- Snowflake
- Power BI / Tableau (Good to have)
Posted 3 weeks ago
4.0 - 9.0 years
10 - 11 Lacs
Hyderabad, Bengaluru
Work from Office
No. of Positions: 4 | Experience: 4 to 9 Years | Hyderabad / Bangalore

Role requirements:
- Solid understanding of Salesforce.com architecture, design, development, administration, and operational support.
- Analyze requirements and design solutions that are achievable, acceptable, and consistent with customer expectations and good architectural principles.
- Provide development and administration support on the Salesforce platform, including standard Salesforce functionality such as page layouts, field additions, and permissions.
- Advanced development and administration, including triggers, S-controls, workflow, validation rules, Lightning components, etc.
- Code development using Apex and Force.com.
- Development using existing and upcoming Salesforce frameworks such as Lightning as well as custom frameworks.
- Assist the Support team in troubleshooting and resolving technical issues.
- Provide timely status updates and identify and communicate risks in a timely manner.
- Work closely with business analysts and/or key business community users on enhancements and bug fixes.
- Ensure consistent delivery of IT solutions by following Informatica's development standards and the architecture framework.
- Work effectively in a distributed team.
- Demonstrated deep technical knowledge with a minimum of 3 years' experience working with the Force.com developer toolkit: Apex, Visualforce, Lightning, Force.com IDE, Force.com Migration Tool, Web Services/SOA, and Metadata APIs.
- SQL and RDBMS experience.
- Strong communication skills, both verbal and written.
- Demonstrated strong prototyping, coding, and debugging skills.
Posted 3 weeks ago
3.0 - 5.0 years
7 - 10 Lacs
Kolkata
Hybrid
Intermediate understanding of Docker & Kubernetes. Fundamental understanding of Python & Java. Experience working with Ansible. Good knowledge of shell scripting. Experience working with Linux-based architecture, RDBMS, Spark, Elasticsearch, and NoSQL.
Posted 3 weeks ago
5.0 - 8.0 years
7 - 10 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
GCP Java Developer | 5-8 YRS | PUNE/HYDERABAD/BANGALORE

Looking for a workplace where people realize their full potential, are recognized for the impact they make, and enjoy the company of the peers they work with? Welcome to Zensar! Read on for more details on the role and about us.

Skills required to contribute (Mandatory Experience):
- 5 to 8 years of experience in Java, Spring Boot, and microservices development.
- Experience in GCP, including building a range of services on Google Cloud.
- Expert understanding, from an operational standpoint, of GCP data and analytics services (Bigtable, BigQuery, GCS, etc.).
- Knowledge of Adobe Experience Platform preferred.
- Good knowledge of RDBMS, SQL queries, and NoSQL.
- Excellent communication and interpersonal skills.
- Experience in agile methodologies and acquaintance with Jira/Confluence.
- Certification on Google Cloud will be an added advantage.
- Graduation or equivalent formal education in Computer Science/IT.
- Strong technical expertise in core Java, the Spring framework, JPA, Hibernate, etc.
- Good knowledge of and experience with Docker containers and Kubernetes is preferred.
- Should be able to perform code reviews and guide junior team members.

Advantage Zensar: We are a digital solutions and technology services company that partners with global organizations across industries to achieve digital transformation. With a strong track record of innovation, investment in digital solutions, and commitment to client success, at Zensar, you can help clients achieve new thresholds of performance. A subsidiary of RPG Group, Zensar has its HQ in India and offices across the world, including Mexico, South Africa, the UK, and the USA. Zensar is all about celebrating individuality, creativity, innovation, and flexibility. We hire based on values, talent, and the potential necessary to fill a given job profile, irrespective of nationality, sexuality, race, color, and creed. We also put in place policies to empower this assorted talent pool with the right environment for growth. At Zensar, you Grow, Own, Achieve, Learn. Learn more about our culture: https://www.zensar.com/careers/who-we-are

Ready to #ExperienceZensar? Begin your application by clicking on the Apply Online button below. Be sure to have your resume handy! If you're having trouble applying, drop a line to careers@zensar.com.
Posted 3 weeks ago
8.0 - 13.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
What You'll Do

Job Description: You will provide 24/7 administrative support (on-prem and Atlas Cloud) for MongoDB clusters, Postgres, and Snowflake, and provide support for on-prem and Confluent Cloud Kafka clusters. You will review database designs to ensure all technical and business requirements are met, perform database optimization and testing to ensure service level agreements are met, and provide support during system implementation and in production. You will provide support for Snowflake administrative tasks (data pipelines, object creation, access) and participate in the weekday and weekend on-call rotation to support products running on MongoDB, SQL, Kafka, Snowflake, and other RDBMS systems. This role does not have any managerial responsibilities; it is an individual contributor role reporting to a Sr. Manager, Reliability Engineering.

What Your Responsibilities Will Be
- 8+ years of experience managing MongoDB on-prem and on Atlas Cloud.
- Be a part of the database team in developing next-generation database systems.
- Provide services in administration and performance monitoring of database-related systems.
- Develop system administration standards and procedures to maintain practices.
- Support backup and recovery strategies.
- Contribute to the creative process of improving architectural designs and implementing new architectures.
- Expertise in delivering efficiency and cost effectiveness.
- Monitor and support capacity planning and analysis.
- Monitor performance, troubleshoot issues, and proactively tune databases and workloads.
- Sound knowledge of Terraform and Grafana; manage infrastructure as code using Terraform and GitLab.
- Ability to work remotely.

What You'll Need to be Successful
- Working knowledge of MongoDB (6.0 or above), including sharding and replica sets.
- Working knowledge of database installation, setup, creation, and maintenance processes.
- Working knowledge of Change Streams and Mongo ETLs to replicate live changes to downstream analytics systems.
- Experience running MongoDB in containerized environments (EKS clusters).
- Support reliability engineering tasks for all other database platforms (SQL, MySQL, Postgres, Snowflake, Kafka).
- Experience with Cloud Manager or Ops Manager (a plus).
- Understanding of networking components on AWS and GCP cloud.
- Technical knowledge of backup/recovery, disaster recovery, and high availability techniques.
- Strong technical knowledge in writing shell scripts used to support database administration.
- Good understanding of Kafka and Snowflake administration; understanding of Debezium and Zookeeper is a plus.
- Automate routine database tasks independently with shell, Python, and other languages.
#LI-Onsite

How We'll Take Care of You

Total Rewards: In addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: Benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture. We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship. Learn more about our benefits by region here: Avalara North America

What You Need To Know About Avalara: We're Avalara. We're defining the relationship between tax and tech.
We've already built an industry-leading cloud compliance platform, processing nearly 40 billion customer API calls and over 5 million tax returns a year, and this year we became a billion-dollar business. Our growth is real, and we're not slowing down until we've achieved our mission: to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, one that empowers our people to win. Ownership and achievement go hand in hand here. We instill passion in our people through the trust we place in them. We've been different from day one. Join us, and your career will be too.

We're An Equal Opportunity Employer: Supporting diversity and inclusion is a cornerstone of our company; we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US Veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
Posted 3 weeks ago
5.0 - 10.0 years
15 - 16 Lacs
Bengaluru
Work from Office
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible. As a Lead Software Engineer at JPMorgan Chase within the Chief Technology Office, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for conducting critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
- Develops secure, high-quality production code, and reviews and debugs code written by others.
- Identifies opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems.
- Leads evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented probing of architectural designs, technical credentials, and applicability for use within existing systems and information architecture.
- Leads communities of practice across Software Engineering to drive awareness and use of new and leading-edge technologies.
- Adds to the team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Experience in system design, application development, testing, and ensuring operational stability for traditional distributed systems.
- Experience in developing on blockchain platforms such as Ethereum and Hyperledger.
- Proficient in object-oriented programming languages, with hands-on experience in application programming, backend API development, and distributed microservices infrastructure using one or more languages, e.g. Solidity, Java, JavaScript, Python, Go.
- Understanding of the cryptographic principles that support blockchain technologies, including encryption algorithms, key management, and key wallets.
- Experience with cloud infrastructure services such as Amazon, Google, or Microsoft.
- Ability to independently address design and functionality challenges with minimal supervision.

Preferred qualifications, capabilities, and skills
- Familiar with designing and writing smart contracts in Solidity using Hardhat, Truffle, Slither scanning, etc.
- Familiar with AWS, EKS, and Docker.
- Familiarity with RDBMS is advantageous.
- DevOps practices and tools for continuous integration and deployment.
Posted 3 weeks ago
5.0 - 10.0 years
50 - 55 Lacs
Hyderabad
Work from Office
As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking / Firmwide Core Deposits team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities
- Lead and contribute to the development of server-side applications using Java and Spring.
- Design and develop scalable architecture, ensuring performance, efficiency, and resiliency.
- Implement message- and command-driven frameworks, with a preference for Kafka.
- Develop and execute automated testing strategies, including end-to-end testing.
- Collaborate effectively within an agile team setting and communicate with key stakeholders.
- Provide DevOps production support to ensure application stability and address business queries.
- Mentor junior developers and drive design and code review sessions.
- Add to the team culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years of applied experience.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Proficient in coding in one or more languages; experience in Java/J2EE and Spring Boot.
- Overall knowledge of the Software Development Lifecycle.
- Solid understanding of agile methodologies such as CI/CD, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills
- Familiarity with DB2, JCL, and the IBM WebSphere MQ messaging solution.
- Experience with cloud (AWS).
- Experience with application security, performance engineering, and integrations.
- Familiarity with Python, Kafka, Maven, Git, RDBMS, and Docker/Kubernetes.
Posted 3 weeks ago
1.0 - 4.0 years
5 - 6 Lacs
Noida
Work from Office
We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead the development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker, Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers.

What you will do:
- Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot, and related frameworks.
- Design, develop, and maintain scalable Java components using REST or SOAP-based web services.
- Design and develop enterprise solutions with messaging or streaming frameworks such as ActiveMQ, HornetQ, and Kafka.
- Work with integration frameworks such as Apache Camel, JBoss Fuse, Mule ESB/EAI, and Spring Integration.
- Make effective use of caching technologies (such as Hazelcast, Redis, Infinispan, EHCache, MemCache) in the application to handle large volumes of data.
- Deploy the application to middleware or an app server (such as JBoss, WebLogic, or Tomcat).
- Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar, and other tools.

The skills you bring:
- Strong expertise in Java/J2EE and Spring Boot microservices.
- Good understanding of core Java concepts (such as the Collections Framework and object-oriented design).
- Experience working with multithreading concepts (such as thread pools, ExecutorService, FutureTask, the concurrency API, and CountDownLatch).
- Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces, and functional interfaces.
- Proficiency in Java web application development using Spring MVC and Spring Boot.
- Good knowledge of data access frameworks using ORM (Hibernate, JPA).
- Familiarity with database concepts and knowledge of RDBMS/SQL.
- Good understanding of monolithic and microservice architectures.
Posted 3 weeks ago
5.0 - 10.0 years
5 - 9 Lacs
Pune
Work from Office
Essential (Hands-On) Skills:
- .NET Core, .NET 6/8
- SOLID principles
- Unit testing (MSTest/NUnit/xUnit or similar) using a mock framework (Moq/NSubstitute or similar)
- OOAD and design patterns (for tech lead / senior developer)
- Logger (Seq or similar)
- SDLC: Agile Scrum
- Database: RDBMS (MSSQL/PostgreSQL or similar)

Preferable Skills/Exposure:
- REST API development and unit testing
- Swagger/OpenAPI (for tech lead / senior developer)
- Cloud infra (AWS): Lambda functions, Gateway, authentication/introspection/identity
- Log management (Splunk/Datadog or similar)
- Backends-for-Frontends (BFF) API architecture (for tech lead / senior developer)
- PowerShell scripting
Posted 3 weeks ago
7.0 - 12.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Juniper's Supply Chain Operations is a data-driven organization, and the demand for data engineering, data science, and analytics solutions for decision-making has increased 4x over the last 3 years. In addition, continuous changes in the regulatory environment and geopolitical issues call for a very flexible and resilient supply chain requiring many new data-driven use cases. We need a self-motivated team player for this critical role in the Data Analytics Team to continue to satisfy and fulfill the growing demand for data and data-driven solutions, including developing AI solutions on top of the SCO data stack.

Responsibilities: As a member of the SCO Analytics team, this role will be responsible for implementing and delivering business intelligence initiatives in supply chain operations. This role will collaborate with key business users, develop key metrics and reports, and prepare the underlying data using new automated data preparation tools such as Alteryx. This role will also interface with Juniper Enterprise IT for seamless delivery of integrated solutions. Major responsibilities include leading/delivering data science and business intelligence initiatives in supply chain operations, collaborating with key business users, developing insightful analytical models, metrics, and reports, and coordinating with Juniper Enterprise IT for seamless delivery of system-based solutions.

Minimum Qualifications:
- Bachelor's degree.
- 7+ years of hands-on skills and understanding of reporting solutions and data models.
- Building end-to-end data engineering pipelines for semi-structured and unstructured data (text, simple and complex table structures, images, video, and audio data).
- Python, PySpark, SQL, RDBMS.
- Data transformation (ETL/ELT) activities.
- SQL data warehouse (e.g., Snowflake) experience, preferably including administration.
- Techno-functional system analysis skills, including requirements documentation, use case definition, and testing methodologies.
- Experience in managing data quality and data catalog solutions.
- Ability to learn and adapt to Juniper's end-to-end business processes.
- Strong interpersonal, written, and verbal communication.

Preferred Qualifications:
- Working experience with analytics solutions such as Snowflake, Tableau, Databricks, Alteryx, and SAP BusinessObjects tools is preferred.
- Understanding of supply chain business processes and their integration with other areas of the business.

Personal Skills:
- Ability to collaborate cross-functionally and build sound working relationships within all levels of the organization.
- Ability to handle sensitive information with keen attention to detail and accuracy; passion for data-handling ethics.
- Effective time management skills and the ability to solve complex technical problems with creative solutions while anticipating stakeholder needs and helping meet or exceed expectations.
- Comfortable with the ambiguity and uncertainty of change when assessing stakeholder needs.
- Self-motivated and innovative; confident when working independently, but an excellent team player with a growth-oriented personality.

Other Information: Relocation is not available for this position. Travel requirement for the position: 10%.
Posted 3 weeks ago
0.0 - 1.0 years
0 Lacs
Bengaluru
Work from Office
Who are we: Wabtec Corporation is a leading global provider of equipment, systems, digital solutions, and value-added services for freight and transit rail as well as the mining, marine, and industrial markets. Drawing on nearly four centuries of collective experience across Wabtec, GE Transportation, and Faiveley Transport, the company has grown to become One Wabtec, with unmatched digital expertise, technological innovation, and world-class manufacturing and services, enabling the digital-rail-and-transit ecosystems. Wabtec is focused on performance that drives progress and unlocks our customers' potential by delivering innovative and lasting transportation solutions that move and improve the world. We are lifelong learners obsessed with making things better to drive exceptional results. Wabtec has approximately 27K employees in facilities throughout the world. Visit our website to learn more! http://www.WabtecCorp.com

Overview: The intern will be part of the products in Digital Solutions (e.g., Ports, Rail domain) and will work as part of the engineering team delivering functional components as needed by product management. The intern is expected to have basic SDLC knowledge, assist with the development, troubleshooting, and deployment of software projects, and report progress regularly to key stakeholders.

Key Deliverables & Expectations:
- Take ownership of module delivery under the guidance of an assigned mentor.
- Collaborate well with team members.
- Develop code and write unit and integration test cases.
- Work closely with required teams to ensure business functionality is delivered on time.
- Participate in technical discussions and contribute in the form of presentations or proofs of concept.
- Be flexible, a quick learner, and an excellent problem solver.

Technical and Soft Skills:
- Knowledge of programming (C/C++/Java/Python).
- Knowledge of frontend technologies such as AngularJS, JavaScript, etc.
- Good understanding of RDBMS.
- Knowledge of cloud-related technologies such as AWS will be an added advantage.
- Good knowledge of data structures and algorithms is required.

Our Commitment to Embrace Diversity: Wabtec is a global company that invests not just in our products, but also our people by embracing diversity and inclusion. We care about our relationships with our employees and take pride in celebrating the variety of experiences, expertise, and backgrounds that bring us together. At Wabtec, we aspire to create a place where we all belong and where diversity is welcomed and appreciated. To fulfill that commitment, we rely on a culture of leadership, diversity, and inclusion. We aim to employ the world's brightest minds to help us create a limitless source of ideas and opportunities. We have created a space where everyone is given the opportunity to contribute based on their individual experiences and perspectives and recognize that these differences and diverse perspectives make us better. We believe in hiring talented people of varied backgrounds, experiences, and styles. People like you! Wabtec Corporation is committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or expression, or protected Veteran status. If you have a disability or special need that requires accommodation, please let us know.
Posted 3 weeks ago
10.0 - 15.0 years
14 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
The Data Excellence Data Architect is a demonstrated expert in technical and/or functional aspects of customer and partner engagements that lead to the successful delivery of data management projects. The Data Architect plays a critical role in setting customers up for success by prescriptively helping to shape, and then execute, in the Salesforce data space. This role also provides subject matter expertise related to data management solutions and ensures successful project delivery. This includes helping identify and proactively manage risk areas and ensuring issues are seen through to complete resolution as they relate to implementations. The architect will be able to configure and drive solutions to meet the customer's business and technical requirements. Additionally, this role will help align the development of client-specific implementation proposals, SOWs, and staffing plans, engage with SMEs across the organization to gain consensus on an acceptable proposal, develop best practices within the data excellence community, and develop shared assets.

Responsibilities
- Serve as the subject matter expert for the Salesforce data excellence practice.
- Be recognized as a valuable and trusted advisor by our customers and other members of the Salesforce community, and continue to build a reputation for excellence in professional services.
- Lead the development of multi-year data platform capability roadmaps for internal business units such as Marketing, Sales, Services, and Finance.
- Facilitate enterprise information data strategy development, opportunity identification, business cases, technology adoption opportunities, operating model development, and innovation opportunities.
- Maximize the value derived from data analytics by leveraging data assets through data exploitation, envisioning data-enabled strategies, and enabling business outcomes through analytics, data analytics governance, and enterprise information policy.
- Translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses.
- Define the data architecture framework, standards, and principles, including modeling, metadata, security, reference data such as product codes and client categories, and master data such as clients, vendors, materials, and employees.
- Define data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transition.
- Design and implement effective data solutions and models to store and retrieve data from different data sources.
- Prepare accurate dataset, architecture, and identity mapping designs for execution and management purposes.
- Examine and identify data structural necessities by evaluating client operations, applications, and programming.
- Research and properly evaluate new sources of information and new technologies to determine possible solutions and limitations in reliability or usability.
- Assess data implementation procedures to ensure they comply with internal and external regulations.
- Lead or participate in architecture governance, compliance, and security activities (architectural reviews, technology sourcing) to ensure technology solutions are consistent with the target-state architecture.
- Partner with stakeholders early in the project lifecycle to identify business, information, technical, and security architecture issues and act as a strategic consultant throughout the technology lifecycle.
- Oversee the migration of data from legacy systems to new solutions.

Preferred Qualifications and Skills
- BA/BS degree or foreign equivalent.
- Overall 10+ years of experience in the marketing data and data management space.
- Minimum 1 year of hands-on, full-lifecycle CDP implementation experience on platforms like Salesforce CDP (formerly 360 Audiences), Tealium AudienceStream, Adobe AEP, Segment, Arm Treasure Data, BlueShift, SessionM, RedPoint, etc.
- 5+ years of experience with data management, data transformation, and ETL, preferably using cloud-based tools/infrastructure.
- Experience with data architecture (ideally with marketing data) using batch and/or real-time ingestion.
- Relevant Salesforce experience in Sales and Service Cloud as well as Marketing Cloud; related certifications are a plus (Marketing Cloud Consultant, Administrator, Advanced Administrator, Service Cloud Consultant, Sales Cloud Consultant, etc.).
- Experience with technologies and processes for marketing, personalization, and data orchestration.
- Experience with master data management (MDM), data governance, data security, data quality, and related tools desired.
- Demonstrated deep data integration and/or migration experience with Salesforce.com and other cloud-enabled tools.
- Demonstrated expertise in complex SQL statements and RDBMS systems such as Oracle, Microsoft SQL Server, and Postgres.
- Demonstrated experience with complex coding through ETL tools such as Informatica, SSIS, Pentaho, and Talend.
- Knowledge of data governance and data privacy concepts and regulations is a plus.

Required Skills
- Ability to work independently and be a self-starter.
- Comfort and ability to learn new technologies quickly and thoroughly.
- Specializes in gathering and analyzing information related to data integration, subscriber management, and identity resolution.
- Excellent analytical and problem-solving skills.
- Demonstrated ability to influence a group audience, facilitate solutions, and lead discussions such as implementation methodology, roadmapping, enterprise transformation strategy, and executive-level requirement gathering sessions.
- Travel to client site (up to 50%).
Posted 3 weeks ago
2.0 - 6.0 years
7 - 11 Lacs
Hyderabad
Work from Office
License Management Services Specialist, Legal @ Hyderabad, Bengaluru - Progress Careers

Job Summary: We are Progress (Nasdaq: PRGS), the trusted provider of software that enables our customers to develop, deploy, and manage responsible, AI-powered applications and experiences with agility and ease. We're proud to have a diverse, global team where we value the individual and enrich our culture by considering varied perspectives, because we believe people power progress. Join us as a License Management Services Specialist and help us do what we do best: propelling business forward. This role will perform impartial and co-operative license reviews within the Progress customer base, located across the USA/EMEA regions. The Technical Analyst will actively participate in the license reviews (audits) that are focused on helping our customers (and partners) manage their Progress Software environment on an ongoing basis.

In this role, you will:
- Act as a License Management expert in Software Asset Management activities and advise on the suitability of licensing structures to optimize the customers' business and system environments.
- Perform daily license reviews for Progress customers to determine and monitor software usage, ensuring compliance or addressing any commercial issues arising from exceeding license entitlements.
- Take ownership and drive the resolution of any identified issues.
- Conduct comprehensive and accurate analyses of collected data, handling large volumes efficiently.
- Analyze and interpret software licensing agreements.
- Develop and maintain a high level of understanding of the products and their associated licensing terms, conditions, metrics, business practices, and policies.
- Work on multiple license management-related projects as required and generate ad-hoc reports as needed.
- Build and maintain strong working relationships with other internal departments such as Sales, Finance, Legal, Customer Order Management, and Support.

Your background:
- University degree (or equivalent), preferably in an information technology field or a technical major.
- A software background, knowledge of basic object-oriented programming languages, and an understanding of relational database management systems (RDBMS) such as Microsoft SQL Server, MySQL, Oracle Database, and OpenEdge is not a must but will be considered an advantage.
- Pre-sales experience in a software company will be considered an advantage.
- Excellent English verbal and written communication skills are mandatory.
- Customer-focused experience with excellent communication and interpersonal skills within the software industry.
- A good understanding of client operating systems and network operating systems.
- Ability to work both independently and in a team environment.
- Ability to prioritize tasks and handle multiple projects simultaneously.
- Excellent research skills and the ability to resolve complex problems.
- Experience in dealing with customer contracts is preferred.
- Familiarity with basic license management technical tools and processes is preferred.
- Experience in license compliance and/or auditing practices is preferred.

If this sounds like you and fits your experience and career goals, we'd be happy to chat. What we offer in return is the opportunity to experience a great company culture with wonderful colleagues to learn from and collaborate with, and also to enjoy:

Compensation:
- Competitive remuneration package
- Employee Stock Purchase Plan enrolment

Vacation, Family, and Health:
- 30 days of earned leave
- An extra day off for your birthday
- Various other leaves such as marriage leave, casual leave, maternity leave, and paternity leave
- Premium group medical insurance for employees and five dependents, personal accident insurance coverage, and life insurance coverage
- Professional development reimbursement
- Interest subsidy on loans - either vehicle or personal loans

Apply now!
Posted 3 weeks ago
4.0 - 9.0 years
4 - 7 Lacs
Bengaluru
Work from Office
We are looking to hire AWS Glue professionals in the following areas:
- 4 or more years of experience in AWS Glue, Redshift, and Python.
- 4+ years of experience in engineering, with experience in ETL-type work with cloud databases.
- Data management / data structures: must be proficient in technical data management tasks, i.e., writing code to read, transform, and store data.
- Spark: experience in launching Spark jobs in client mode and cluster mode; familiarity with the property settings of Spark jobs and their implications for performance.
- SCC/Git: must be experienced in the use of source code control systems such as Git.
- ETL: experience with developing ELT/ETL processes, including loading data from enterprise-sized RDBMS systems such as Oracle, DB2, and MySQL.
- Programming: must be able to code in Python or be an expert in at least one high-level language such as Java, C, or Scala; must have experience in using REST APIs.
- SQL: must be an expert in manipulating database data using SQL; familiarity with views, functions, stored procedures, and exception handling.
- AWS: general knowledge of the AWS stack (EC2, S3, EBS, etc.).
- IT process compliance: SDLC experience and formalized change controls; working in DevOps teams based on Agile principles (e.g., Scrum); ITIL knowledge (especially incident, problem, and change management).
- Proficiency in PySpark for distributed computation.
- Familiarity with Postgres and ElasticSearch.

Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
Posted 3 weeks ago
3.0 - 8.0 years
50 - 55 Lacs
Bengaluru
Work from Office
Responsibilities
- Design, develop, and maintain various components of our cloud management platform, ensuring high performance and responsiveness.
- Work in collaboration with cross-functional teams to conceptualize, design, and deploy innovative features and functionalities that meet our business needs.
- Offer technical support and guidance to internal teams and stakeholders, helping to resolve complex issues.
- Keep abreast of the latest trends and technologies in the industry to incorporate best practices into our platform.

Requirements
- 1.5 to 2 years of professional experience in developing applications or platforms using Python.
- Strong understanding of Object-Oriented Programming (OOP), SOLID principles, and Relational Database Management Systems (RDBMS).
- Proven experience with AWS services, such as Lambda, RDS, and DynamoDB, with a strong grasp of cloud computing concepts and architectural best practices.
- Experience in developing and integrating RESTful APIs.
- Experience with source control systems, such as Git.
- Exceptional problem-solving abilities, with a knack for debugging complex systems.
- Excellent communication skills, capable of effectively collaborating with team members and engaging with stakeholders.
- A relentless drive for learning and staying current with industry developments.
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.

Nice to have
- AWS Certified Developer Associate or other relevant AWS certifications.
- Experience in serverless development is a significant plus, showcasing familiarity with building and deploying serverless applications.
- Experience with the AWS Boto3 SDK for Python.
- Exposure to other cloud platforms like Azure or GCP.
- Familiarity with containerization and orchestration technologies, such as Docker and Kubernetes.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Description & Requirements: Our development group supports Infor Nexus, the world's leading network for multi-enterprise supply chain orchestration. The network connects businesses to their entire supply chain, from suppliers and manufacturers to brokers, 3PLs, and banks, paving the way for enhanced supply chain visibility, collaboration, and predictive intelligence. As a senior software engineer, you'll apply sound coding and design skills in Java to build scalable software, working with big data, cloud platforms, open source, and large code bases. We are looking for an individual who can take ownership of and tackle coding problems head-on to devise intelligent solutions. We are a collaborative environment where everyone is willing to help and work as a team to resolve issues.

Basic Qualifications:
- Looking for candidates from premium institutes only.
- Experience: 3-5 years. Location: Bangalore.
- Experience with software development; proficiency in Java strongly preferred, though other OO languages may be considered.
- Strong OOD concepts and OOP aptitude.
- Well versed in SQL, with experience in RDBMS such as SQL Server and Oracle.
- Excellent communication and interpersonal skills.
- Work timings: 9 PM-1 AM IST; day hours will be discussed.

Responsibilities:
- Developing, testing, and deploying Java code.
- Solving for answers using SQL and relational databases.
- Learning multiple cloud environments.
- Storing code in large code bases.
- Collaborating to find the best solutions.
Posted 3 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Gurugram
Work from Office
Develop, test, and maintain ETL solutions using Informatica PowerCenter/IICS. Work with business analysts and SMEs to understand requirements and build data integration workflows. Analyze various data sources (RDBMS, CSV, Salesforce, etc.). Create design documents and suggest technical improvements. Coordinate with onshore teams and ensure timely delivery. Follow development best practices and ensure quality documentation.

Mandatory Skills:
- Strong hands-on experience in ETL development using Informatica PowerCenter/IICS
- Good knowledge of SQL/PL-SQL, Microsoft SQL Server, and Oracle
- Familiarity with DevOps tools: Bitbucket, Jenkins, Git, etc.
- Exposure to job scheduling tools such as Control-M or ZEKE
- Basic Unix/Windows scripting
- Understanding of SDLC and Agile methodologies

Nice to Have:
- Experience with Cloud (AWS), CAI, and APIs
- Knowledge of the insurance domain
- Familiarity with Qlik Sense and Databricks

Work Model & Timings:
- Shift: must overlap with the onshore team until 11 AM EST
- Flexibility to adjust work hours occasionally

Skills: Hybrid, Docker, Containerization, Shell Scripting, SQL, CI/CD Pipelines, Bitbucket, DevOps, IICS, Informatica
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Pune
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.

Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience in ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or related field.
- Experience in Agile methodologies and project management.
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Mumbai
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.
Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience in ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or related field.
- Experience in Agile methodologies and project management.
Posted 3 weeks ago
8.0 - 10.0 years
11 - 18 Lacs
Jaipur
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.
Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience in ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or related field.
- Experience in Agile methodologies and project management.
Posted 3 weeks ago
4.0 - 8.0 years
7 - 12 Lacs
Bengaluru
Work from Office
The Database Patch Engineer is responsible for planning, coordinating, testing, and applying patches and updates to database systems to ensure security, stability, and compliance with cyber security standards. This role requires strong technical knowledge of database systems, patching best practices, and advanced troubleshooting skills.
Key Deliverables (Duties and Responsibilities):
- Plan and implement database patches (security patches, bug fixes, and upgrades) across multiple environments (development, testing, QA, and production).
- Collaborate with DBAs, application teams, and system administrators to schedule and execute patching activities with minimal downtime.
- Perform impact analysis and risk assessments prior to patch deployments.
- Test patches in lower environments to ensure compatibility and performance.
- Monitor patch releases from vendors (e.g., Oracle, Microsoft, DB2) and maintain patch management schedules.
- Maintain detailed documentation of patching procedures, schedules, and post-patch validations.
- Automate patching processes where feasible using scripting or patch management tools.
- Troubleshoot and resolve patch-related issues promptly.
- Ensure compliance with cybersecurity standards and audit requirements.
- Assist in database upgrades, migrations, and disaster recovery exercises.
- Provide post-patching reports and status updates to leadership.
Skills and Qualifications (Functional and Technical Skills):
- Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- 7+ years of experience in database administration and patch management.
- Strong expertise with major RDBMS platforms (Oracle, SQL Server, DB2, etc.).
- Experience with database patching tools (e.g., Oracle OPatch, Microsoft Windows Update, Red Hat Satellite).
- Proficiency in SQL, scripting (Shell, PowerShell, Python), and automation frameworks.
- Knowledge of ITIL processes and change management.
- Good understanding of high availability (HA) clusters, ASM (Automatic Storage Management), and disaster recovery (DR) strategies.
- Excellent troubleshooting and problem-solving skills.
- Experience working with multiple operating system environments: Windows Server, RHEL, and AIX.
- Strong communication and collaboration skills.
Preferred Qualifications:
- Certifications such as Oracle Certified Professional (OCP), Microsoft Certified: Azure Database Administrator Associate, or equivalent.
- Experience with cloud-based databases (AWS RDS, Azure SQL, Oracle Cloud).
- Experience in DevOps environments and with Infrastructure as Code practices.
Relationships & Collaboration:
- Maintain strong partnerships with IT counterparts (India as well as US associates).
- Be available for team connects as needed.
- Flexibility to adjust work schedule, including evenings and weekends, as per organizational need.
- Collaborate with the business wherever necessary.
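As a rough illustration of the post-patch validation step mentioned above, here is a small Java/JDBC sanity check. The JDBC URL, credentials, and driver availability are assumptions; the comment about DBA_REGISTRY_SQLPATCH describes an optional Oracle-specific check that requires DBA privileges and an Oracle 12c or later database.

```java
import java.sql.*;

public class PostPatchCheck {
    public static void main(String[] args) throws SQLException {
        // Placeholder JDBC URL and credentials; in practice these come from a vault or config file.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "patch_check", "secret")) {

            DatabaseMetaData meta = conn.getMetaData();
            System.out.println("Product : " + meta.getDatabaseProductName());
            System.out.println("Version : " + meta.getDatabaseProductVersion());

            // Simple sanity query to confirm the instance accepts work after patching.
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT 1 FROM dual")) {
                System.out.println("Connectivity check: " + (rs.next() ? "OK" : "FAILED"));
            }
            // On Oracle 12c+ one could additionally query DBA_REGISTRY_SQLPATCH
            // (DBA privileges required) to confirm the expected patch reports a successful status.
        }
    }
}
```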
Posted 3 weeks ago
3.0 - 6.0 years
4 - 8 Lacs
Noida
Work from Office
We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker & Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers.
What you will do:
- Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot, and related frameworks.
- Design, develop, and maintain scalable Java components using REST or SOAP based web services.
- Design and develop enterprise solutions with messaging or streaming frameworks such as ActiveMQ, HornetQ, and Kafka.
- Work with integration frameworks such as Apache Camel, JBoss Fuse, Mule ESB/EAI, and Spring Integration.
- Make effective use of caching technologies (such as Hazelcast, Redis, Infinispan, EHCache, Memcached) to handle large data volumes in the application.
- Deploy the application on a middleware or app server (such as JBoss, WebLogic, Tomcat).
- Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar, and other tools.
The skills you bring:
- Strong expertise in Java/J2EE, Spring Boot, and microservices.
- Good understanding of core Java concepts (such as the Collections Framework and object-oriented design).
- Experience working with multithreading concepts (such as thread pools, ExecutorService, FutureTask, the concurrency API, CountDownLatch).
- Detailed working exposure to Java 8, including the Stream API, lambdas, and functional interfaces.
- Proficiency in Java web application development using Spring MVC and Spring Boot.
- Good knowledge of data access frameworks using ORM (Hibernate and JPA).
- Familiarity with database concepts and knowledge of RDBMS/SQL.
- Good understanding of monolithic and microservice architectures.
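As a quick refresher on the concurrency and Java 8 items listed above, here is a self-contained sketch combining a bounded thread pool, CompletableFuture, and the Stream API. The "supplier quote" scenario and all names in it are invented purely for illustration.

```java
import java.util.List;
import java.util.concurrent.*;
import java.util.stream.Collectors;

public class QuoteAggregator {
    // Pretend remote call; a real system might hit a REST endpoint or a Kafka-backed service.
    static double fetchQuote(String supplier) {
        try { TimeUnit.MILLISECONDS.sleep(100); } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return Math.abs(supplier.hashCode() % 100) + 100.0;
    }

    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(4);   // bounded thread pool
        List<String> suppliers = List.of("acme", "globex", "initech");

        // Fan out one asynchronous call per supplier.
        List<CompletableFuture<Double>> futures = suppliers.stream()
                .map(s -> CompletableFuture.supplyAsync(() -> fetchQuote(s), pool))
                .collect(Collectors.toList());

        // Block only at the aggregation point, then pick the lowest quote.
        double best = futures.stream()
                .map(CompletableFuture::join)
                .min(Double::compare)
                .orElseThrow();

        System.out.println("Best quote: " + best);
        pool.shutdown();
    }
}
```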
Posted 3 weeks ago
2.0 - 6.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Responsible for testing of raw material, IP/FP, and stability samples as per the method of analysis. Should have knowledge of method verification/validation and method transfer analysis. Should have exposure to the operation, calibration, qualification, and maintenance of laboratory instruments/equipment, including sophisticated instruments such as HPLC, Dissolution, FTIR, UV, GC, Autotitrator, Karl Fischer, and PSD. Should have exposure to preparing calibration and PM schedules for laboratory instruments/equipment. Should have knowledge of the preparation, handling, and management of working/reference standards. Responsible for testing of in-process, stability, and finished product samples, and raw material. Should have awareness of Caliber LIMS system operation. Should have exposure to method transfer/verification/validation activities.
Posted 3 weeks ago
The job market for RDBMS (Relational Database Management System) professionals in India is thriving, with a high demand for skilled individuals who can design, implement, and manage relational databases. Companies across various industries are actively seeking RDBMS experts to maintain their data infrastructure and ensure efficient data management.
India's major tech hubs are known for their vibrant tech scenes and offer numerous opportunities for RDBMS professionals.
The average salary range for RDBMS professionals in India varies based on experience and expertise. Entry-level positions can expect to earn around INR 3-5 lakhs per annum, while experienced professionals can command salaries ranging from INR 8-15 lakhs per annum.
A typical career path in RDBMS starts with roles such as Junior Database Administrator or Database Developer. As professionals gain experience and expertise, they can progress to roles like Senior Database Administrator, Data Architect, or Database Manager. Eventually, experienced individuals may advance to positions such as Tech Lead or Database Architect.
In addition to RDBMS expertise, professionals in this field are often expected to have knowledge of:
- SQL programming
- Database design principles
- Data modeling
- Performance tuning
- Database security
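To make the first two items concrete, here is a small, generic sketch that creates a normalised two-table schema with a supporting index from Java. The schema is hypothetical, and the example assumes the H2 in-memory database driver is on the classpath purely so it runs without external setup.

```java
import java.sql.*;

public class SchemaSetup {
    public static void main(String[] args) throws SQLException {
        // H2 in-memory database used purely for illustration.
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement st = conn.createStatement()) {

            // A small normalised model: one customer has many orders.
            st.execute("CREATE TABLE customer (" +
                       "  customer_id BIGINT PRIMARY KEY," +
                       "  name VARCHAR(100) NOT NULL)");
            st.execute("CREATE TABLE orders (" +
                       "  order_id BIGINT PRIMARY KEY," +
                       "  customer_id BIGINT NOT NULL REFERENCES customer(customer_id)," +
                       "  order_date DATE NOT NULL," +
                       "  amount DECIMAL(12,2) NOT NULL)");

            // Index to support the common 'orders for a customer' lookup.
            st.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)");

            System.out.println("Schema created.");
        }
    }
}
```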
As you explore opportunities in the RDBMS job market in India, remember to showcase your expertise in relational databases and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and you'll be well-positioned to land a rewarding career in this field. Good luck!