4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
The Purview team is dedicated to protecting and governing the enterprise digital estate on a global scale. Our mission involves developing cloud solutions that offer premium features such as security, compliance, data governance, data loss prevention, and insider risk management. These solutions are fully integrated across Office 365 services and clients, as well as Windows. We create global-scale services to transport, store, secure, and manage some of the most sensitive data on the planet, leveraging Azure, Exchange, and other cloud platforms, along with Office applications like Outlook.

The IDC arm of our team is expanding significantly and seeks talented, highly motivated engineers. This is an excellent opportunity for those looking to build expertise in cloud distributed systems, security, and compliance. Our team develops cloud solutions that meet the demands of a vast user base, using state-of-the-art technologies to deliver comprehensive protection. Office 365, the industry leader in hosted productivity suites, is the fastest-growing business at Microsoft, with over 100 million seats hosted in multiple data centers worldwide.

The Purview Engineering team provides leadership, direction, and accountability for application architecture, cloud design, infrastructure development, and end-to-end implementation. You will independently determine and develop architectural approaches and infrastructure solutions, conduct business reviews, and operate our production services. Strong collaboration skills are essential: you will work closely with other engineering teams to ensure our services and systems are highly stable and performant and meet the expectations of both internal and external customers and users.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
Build cloud-scale services that process and analyze massive volumes of organizational signals in real time.
Harness the power of Apache Spark for high-performance data processing and scalable pipelines (see the illustrative sketch after this posting).
Apply machine learning to uncover subtle patterns and anomalies that signal insider threats.
Craft intelligent user experiences using React and AI-driven insights to help security analysts act with confidence.
Work with a modern tech stack and contribute to a product that's mission-critical for some of the world's largest organizations.
Collaborate across disciplines—from data science to UX to cloud infrastructure—in a fast-paced, high-impact environment.
Design and deliver end-to-end features including system architecture, coding, deployment, scalability, performance, and quality.
Develop large-scale distributed software services and solutions that are modular, secure, reliable, diagnosable, and reusable.
Conduct investigations and drive investments in complex technical areas to improve systems and services.
Ensure engineering excellence by writing effective code and unit tests, debugging, conducting code reviews, and building CI/CD pipelines.
Troubleshoot and optimize live-site operations, focusing on automation, reliability, and monitoring.

Qualifications - Required
Solid understanding of Object-Oriented Programming (OOP) and common design patterns.
4+ years of software development experience, with proficiency in C#, Java, or Scala.
Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud; experience with Azure services is a plus.
Familiarity with DevOps practices, CI/CD pipelines, and agile methodologies.
Strong skills in distributed systems and data processing.
Excellent communication and collaboration abilities, with the capacity to handle ambiguity and prioritize effectively.
A BS or MS degree in Computer Science or Engineering, or equivalent work experience.

Qualifications - Other Requirements
Ability to meet Microsoft, customer, and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the Microsoft Cloud Background Check: this position will be required to pass the Microsoft background check and Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
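For illustration only (this is not Microsoft's code), here is a minimal sketch of the kind of Spark-based signal processing the responsibilities above describe. The event schema, storage paths, and threshold are invented assumptions:

```python
# Hypothetical sketch: aggregating organizational signals with PySpark.
# Schema, paths, and thresholds are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("signal-aggregation").getOrCreate()

# Read a day's worth of signal events (hypothetical Parquet layout).
signals = spark.read.parquet("abfss://signals@example.dfs.core.windows.net/dt=2024-01-01/")

# Count distinct sensitive-file downloads per user per hour; large spikes
# become candidate anomalies for downstream ML scoring.
hourly = (
    signals
    .filter(F.col("event_type") == "file_download")
    .groupBy("user_id", F.window("event_time", "1 hour"))
    .agg(F.countDistinct("file_id").alias("distinct_files"))
)

hourly.filter(F.col("distinct_files") > 500).write.mode("overwrite").parquet(
    "abfss://alerts@example.dfs.core.windows.net/candidates/"
)
```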
Posted 1 week ago
0.0 - 4.0 years
0 - 0 Lacs
Sahibzada Ajit Singh Nagar, Mohali, Punjab
On-site
Job Title: Python Backend Developer (Data Layer)
Location: Mohali, Punjab
Company: RevClerx

About RevClerx: RevClerx Pvt. Ltd., founded in 2017 and based in the Chandigarh/Mohali area (India), is a dynamic Information Technology firm providing comprehensive IT services with a strong focus on client-centric solutions. As a global provider, we cater to diverse business needs including website designing and development, digital marketing, lead generation services (including telemarketing and qualification), and appointment setting.

Job Summary: We are seeking a skilled Python Backend Developer with a strong passion and proven expertise in database design and implementation. This role requires 3-4 years of backend development experience, focusing on building robust, scalable applications and APIs. The ideal candidate will not only be proficient in Python and common backend frameworks but will possess significant experience in designing, modeling, and optimizing various database solutions, including relational databases (like PostgreSQL) and, crucially, graph databases (specifically Neo4j). You will play a vital role in architecting the data layer of our applications, ensuring efficiency, scalability, and the ability to handle complex, interconnected data.

Key Responsibilities:
● Design, develop, test, deploy, and maintain scalable and performant Python-based backend services and APIs.
● Lead the design and implementation of database schemas for relational (e.g., PostgreSQL) and NoSQL databases, with a strong emphasis on graph databases (Neo4j).
● Model complex data relationships and structures effectively, particularly leveraging graph data modeling principles where appropriate.
● Write efficient, optimized database queries (SQL, Cypher, potentially others); see the illustrative sketch after this posting.
● Develop and maintain data models, ensuring data integrity, consistency, and security.
● Optimize database performance through indexing strategies, query tuning, caching mechanisms, and schema adjustments.
● Collaborate closely with product managers, frontend developers, and other stakeholders to understand data requirements and translate them into effective database designs.
● Implement data migration strategies and scripts as needed.
● Integrate various databases seamlessly with Python backend services using ORMs (like SQLAlchemy or Django ORM) or native drivers.
● Write unit and integration tests, particularly focusing on data access and manipulation logic.
● Contribute to architectural decisions, especially concerning data storage, retrieval, and processing.
● Stay current with best practices in database technologies, Python development, and backend systems.

Minimum Qualifications:
● Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field, OR equivalent practical experience.
● 3-4 years of professional software development experience with a primary focus on Python backend development.
● Strong proficiency in Python and its standard libraries.
● Proven experience with at least one major Python web framework (e.g., Django, Flask, FastAPI).
● Demonstrable, hands-on experience designing, implementing, and managing relational databases (e.g., PostgreSQL).
● Experience with at least one NoSQL database (e.g., MongoDB, Redis, Cassandra).
● Solid understanding of data structures, algorithms, and object-oriented programming principles.
● Experience designing and consuming RESTful APIs.
● Proficiency with version control systems, particularly Git.
● Strong analytical and problem-solving skills, especially concerning data modeling and querying.
● Excellent communication and teamwork abilities.

Preferred (Good-to-Have) Qualifications:
● Graph Database Expertise:
○ Significant, demonstrable experience designing and implementing solutions using graph databases (Neo4j strongly preferred).
○ Proficiency in graph query languages, particularly Cypher.
○ Strong understanding of graph data modeling principles, use cases (e.g., recommendation engines, fraud detection, knowledge graphs, network analysis), and trade-offs.
● Advanced Database Skills:
○ Experience with database performance tuning and monitoring tools.
○ In-depth experience with Object-Relational Mappers (ORMs) like SQLAlchemy or Django ORM.
○ Experience implementing data migration strategies for large datasets.
● Cloud Experience: Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) and their managed database services (e.g., RDS, Aurora, Neptune, DocumentDB, MemoryStore).
● Containerization & Orchestration: Experience with Docker and Kubernetes.
● Asynchronous Programming: Experience with Python's asyncio and async frameworks.
● Data Pipelines: Familiarity with ETL processes or data pipeline tools (e.g., Apache Airflow).
● Testing: Experience writing tests specifically for database interactions and data integrity.

What We Offer:
● Challenging projects with opportunities to work on cutting-edge technologies, especially in the field of AI.
● Competitive salary and comprehensive benefits package.
● Opportunities for professional development and learning (e.g., conferences, courses, certifications).
● A collaborative, innovative, and supportive work environment.

How to Apply: Interested candidates are invited to submit their resume and a cover letter outlining their relevant experience, specifically highlighting their database design expertise (including relational, NoSQL, and especially graph DB/Neo4j experience).

Job Types: Full-time, Permanent
Pay: ₹30,000.00 - ₹55,373.94 per month
Benefits: Food provided; Health insurance
Schedule: Day shift, Monday to Friday
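As a hedged illustration only (not RevClerx's actual stack), a minimal Python + Neo4j interaction of the kind this role describes might look like the following; the connection details, node labels, and relationship types are assumptions:

```python
# Minimal sketch: querying a hypothetical user/product graph with the
# official neo4j Python driver. URI, credentials, and schema are invented.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def recommend_products(tx, user_id, limit=5):
    # Collaborative-filtering style Cypher: products bought by users who
    # bought the same things as this user, ranked by peer overlap.
    query = (
        "MATCH (u:User {id: $user_id})-[:BOUGHT]->(:Product)"
        "<-[:BOUGHT]-(peer:User)-[:BOUGHT]->(rec:Product) "
        "WHERE NOT (u)-[:BOUGHT]->(rec) "
        "RETURN rec.name AS name, count(peer) AS score "
        "ORDER BY score DESC LIMIT $limit"
    )
    return [record["name"] for record in tx.run(query, user_id=user_id, limit=limit)]

with driver.session() as session:
    print(session.execute_read(recommend_products, user_id="u42"))
driver.close()
```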
Posted 1 week ago
4.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Data Engineer (Python)

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We are currently seeking a seasoned Data Engineer with strong experience in Python to join our team of professionals.

Key Responsibilities:
Develop Data Lake tables leveraging AWS Glue and Spark for efficient data management.
Implement data pipelines using Airflow, Kubernetes, and various AWS services (see the illustrative sketch after this posting).

Must Have Skills:
Experience in deploying and managing data warehouses
At least 4 years of advanced proficiency in Python for data analysis and organization
Solid understanding of AWS cloud services
Proficiency in Apache Spark for large-scale data processing

Skills and Qualifications Needed:
Practical experience with Apache Airflow for workflow orchestration
Demonstrated ability in designing, building, and optimizing ETL processes, data pipelines, and data architectures
Flexible, self-motivated approach with a strong commitment to problem resolution
Excellent written and oral communication skills, with the ability to deliver complex information clearly and effectively to a range of audiences
Willingness to work globally and across different cultures, and to participate in all stages of the data solution delivery lifecycle, including pre-studies, design, development, testing, deployment, and support
Nice to have: exposure to Apache Druid
Familiarity with relational database systems

Desired Work Experience:
A degree in computer science or a similar field

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
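Purely as an illustrative sketch (not EY's pipeline code), a minimal Airflow DAG of the kind this role describes might look like the following; the DAG id, task names, and the stubbed helpers are invented, and the `schedule` keyword assumes Airflow 2.4+:

```python
# Hypothetical sketch: a daily Airflow DAG that runs a Glue/Spark-style
# load and then a validation step. Job names and helpers are invented.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_glue_job(**context):
    # A real pipeline might call boto3's glue.start_job_run and poll for
    # completion; stubbed here to keep the sketch self-contained.
    print("starting Spark/Glue job for", context["ds"])

def validate_partition(**context):
    print("validating data lake partition for", context["ds"])

with DAG(
    dag_id="daily_data_lake_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load = PythonOperator(task_id="load", python_callable=run_glue_job)
    validate = PythonOperator(task_id="validate", python_callable=validate_partition)
    load >> validate
```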
Posted 1 week ago
10.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud / Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the illustrative sketch after this posting).

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security [on the move, at rest].
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
A team-player attitude and enjoyment of a cooperative, collaborative team environment.
Adaptability to new technologies and standards.
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for evaluating technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 10 years of industry experience.

Ideally, you'll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
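As an illustration only (not EY's code), a minimal Spark Structured Streaming job consuming Kafka, of the kind the responsibilities reference, might look like the following; the broker address, topic, schema, and paths are assumptions, and the job assumes the spark-sql-kafka connector is on the classpath:

```python
# Hypothetical sketch: Kafka -> Spark Structured Streaming -> Parquet.
# Broker, topic, schema, and paths are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

schema = (
    StructType()
    .add("account_id", StringType())
    .add("amount", DoubleType())
)

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)

# Kafka delivers bytes; parse the JSON payload into typed columns.
parsed = raw.select(
    F.from_json(F.col("value").cast("string"), schema).alias("m")
).select("m.*")

query = (
    parsed.writeStream.outputMode("append")
    .format("parquet")
    .option("path", "/data/lake/transactions")
    .option("checkpointLocation", "/data/checkpoints/transactions")
    .start()
)
query.awaitTermination()
```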
Posted 1 week ago
5.0 years
0 Lacs
India
Remote
Who We Are
At Twilio, we're shaping the future of communications, all from the comfort of our homes. We deliver innovative solutions to hundreds of thousands of businesses and empower millions of developers worldwide to craft personalized customer experiences. Our dedication to remote-first work, and strong culture of connection and global inclusion, means that no matter your location, you're part of a vibrant team with diverse experiences making a global impact each day. As we continue to revolutionize how the world interacts, we're acquiring new skills and experiences that make work feel truly rewarding. Your career at Twilio is in your hands.

See yourself at Twilio
Join the team as our next Senior Machine Learning Engineer (L3) on our Comms Platform Engineering team.

About The Job
This position exists to scope, design, and deploy machine learning systems into the real world. The individual will partner closely with Product and Engineering teams to execute the roadmap for Twilio's AI/ML products and services. Twilio is looking for a Senior Machine Learning Engineer to join the rapidly growing Comms Platform Engineering team of our Messaging business unit. You will understand the needs of our customers and build data products that solve them at a global scale. Working side by side with other engineering teams and product counterparts, you will own end-to-end execution of ML solutions. To thrive in this role, you must have a background in ML engineering and a track record of solving data and machine-learning problems at scale. You are a self-starter, embody a growth attitude, and collaborate effectively across the entire Twilio organization.

Responsibilities
In this role, you'll:
Build and maintain scalable machine learning solutions in production
Train and validate both deep learning-based and statistical models, considering use case, complexity, performance, and robustness
Demonstrate end-to-end understanding of applications and develop a deep understanding of the "why" behind our models and systems
Partner with product managers, tech leads, and stakeholders to analyze business problems, clarify requirements, and define the scope of the systems needed
Work closely with data platform teams to build robust, scalable batch and real-time data pipelines
Work closely with software engineers and build tools to enhance productivity and to ship and maintain ML models
Drive engineering best practices around code reviews, automated testing, and monitoring

Qualifications
Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having "desired" qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required
5+ years of applied ML experience
Proficiency in Python is preferred; we will also consider strong quantitative candidates with a background in other programming languages
Strong background in the foundations of machine learning and the building blocks of modern deep learning
Track record of building, shipping, and maintaining machine learning models in production in an ambiguous and fast-paced environment
A clear understanding of frameworks like PyTorch, TensorFlow, or Keras: why and how these frameworks do what they do (see the illustrative sketch after this posting)
Familiarity with MLOps concepts related to testing and maintaining models in production, such as testing, retraining, and monitoring
Demonstrated ability to ramp up, understand, and operate effectively in new application/business domains
Exposure to modern data storage, messaging, and processing tools (Kafka, Apache Spark, Hadoop, Presto, DynamoDB, etc.)
Experience working in an agile team environment with changing priorities
Experience working on AWS

Desired
Experience with Large Language Models

Location
This role will be remote and based in India (only in Karnataka, Tamil Nadu, Maharashtra, Telangana, and New Delhi).

Travel
We prioritize connection and opportunities to build relationships with our customers and each other. For this role, you may be required to travel occasionally to participate in project or team in-person meetings.

What We Offer
Working at Twilio offers many benefits, including competitive pay, generous time off, ample parental and wellness leave, healthcare, a retirement savings program, and much more. Offerings vary by location.

Twilio thinks big. Do you? We like to solve problems, take initiative, pitch in when needed, and are always up for trying new things. That's why we seek out colleagues who embody our values — something we call Twilio Magic. Additionally, we empower employees to build positive change in their communities by supporting their volunteering and donation efforts. So, if you're ready to unleash your full potential, do your best work, and be the best version of yourself, apply now! If this role isn't what you're looking for, please consider other open positions.

Twilio is proud to be an equal opportunity employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, reproductive health decisions, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, genetic information, political views or activity, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act. Additionally, Twilio participates in the E-Verify program in certain locations, as required by law.
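For illustration only (this is not Twilio's code), a bare PyTorch training step of the sort this role assumes familiarity with looks like the following; the toy model and fake batch are invented:

```python
# Minimal sketch: a supervised PyTorch training loop on fake data.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Fake batch: 64 examples, 16 features, binary labels.
x = torch.randn(64, 16)
y = torch.randint(0, 2, (64,))

for step in range(100):
    optimizer.zero_grad()
    logits = model(x)          # forward pass
    loss = loss_fn(logits, y)  # compare predictions to labels
    loss.backward()            # backpropagate gradients
    optimizer.step()           # update weights

print(f"final training loss: {loss.item():.4f}")
```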
Posted 1 week ago
10.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Cloud Architect - Manager

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We're looking for Senior Managers (GTM + Cloud / Big Data Architects) with strong technology and data understanding and proven capability in delivery and pre-sales. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Drive Analytics GTM/pre-sales by collaborating with senior stakeholders in client and partner organizations in BCM, WAM, and Insurance. Activities will include pipeline building, RFP responses, creating new solutions and offerings, conducting workshops, and managing in-flight projects focused on cloud and big data. [10-15 years]
Work with clients to convert business problems/challenges into technical solutions, considering security, performance, scalability, etc.
Understand current and future state enterprise architecture.
Contribute to various technical streams during project implementation.
Provide product- and design-level technical best practices.
Interact with senior client technology leaders to understand their business goals and to create, architect, propose, develop, and deliver technology solutions.
Define and develop client-specific best practices around data management within a Hadoop or cloud environment.
Recommend design alternatives for data ingestion, processing, and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using Hive, Pig, Sqoop, and Spark.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies (see the illustrative sketch after this posting).

Skills And Attributes For Success
Experience architecting highly scalable solutions on Azure, AWS, and GCP.
Strong understanding of and familiarity with Azure/AWS/GCP and Big Data ecosystem components.
Strong understanding of underlying Azure/AWS/GCP architectural concepts and distributed computing paradigms.
Hands-on programming experience in Apache Spark using Python/Scala and Spark Streaming.
Hands-on experience with major components like cloud ETLs, Spark, and Databricks.
Experience working with NoSQL in at least one of these data stores: HBase, Cassandra, MongoDB.
Knowledge of Spark and Kafka integration, with multiple Spark jobs consuming messages from multiple Kafka partitions.
Solid understanding of ETL methodologies in a multi-tiered stack, integrating with Big Data systems like Cloudera and Databricks.
Strong understanding of underlying Hadoop architectural concepts and distributed computing paradigms.
Good knowledge of Apache Kafka and Apache Flume.
Experience in enterprise-grade solution implementations.
Experience in performance benchmarking enterprise applications.
Experience in data security [on the move, at rest].
Strong UNIX operating system concepts and shell scripting knowledge.

To qualify for the role, you must have
Flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
Excellent communication skills (written and verbal, formal and informal).
Ability to multi-task under pressure and work independently with minimal supervision.
A team-player attitude and enjoyment of a cooperative, collaborative team environment.
Adaptability to new technologies and standards.
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.
Responsibility for evaluating technical risks and mapping out mitigation strategies.
Working knowledge of at least one cloud platform: AWS, Azure, or GCP.
Excellent business communication, consulting, and quality process skills.
Excellence in leading solution architecture, design, build, and execution for leading clients in the Banking, Wealth and Asset Management, or Insurance domains.
Minimum 7 years of hands-on experience in one or more of the above areas.
Minimum 10 years of industry experience.

Ideally, you'll also have
Strong project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.

Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
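As a hedged illustration only (not EY's code), the Kafka side of the real-time ingestion this posting references could look like the following bare consumer, using the kafka-python package; the broker, topic, and payload format are assumptions:

```python
# Hypothetical sketch: a bare Kafka consumer in Python (kafka-python).
# Broker address, topic name, and message format are invented.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "transactions",
    bootstrap_servers=["broker:9092"],
    group_id="ingest-demo",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # A real pipeline would validate and land this in a lake/warehouse;
    # here we just print the partition/offset bookkeeping.
    print(message.partition, message.offset, record)
```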
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Responsibilities
· Experience with Spring Boot
· Experience with microservices development
· Extensive experience working with Java REST APIs
· Extensive experience in Java 8-17 SE
· Experience with unit testing frameworks (JUnit or Mockito)
· Experience with Maven/Gradle
· Experience in Angular 16+ and RxJS; NgRx is mandatory, along with experience in unit testing using Jest/Jasmine
· Experience in API design, troubleshooting, and tuning for performance
· Experience in designing and troubleshooting Java API services and microservices
· Professional, precise communication skills
· Experience with any CI/CD tool
· Experience with Apache Kafka is an added advantage

Qualifications we seek in you!
Minimum Qualifications
· BE / B.Tech / M.Tech / MCA
· Excellent communication skills
· Good team player

Preferred Qualifications
· Experience with Spring Boot
· Experience with microservices development
· Extensive experience working with Java REST APIs
· Extensive experience in Java 8-17 SE
· Experience with unit testing frameworks (JUnit or Mockito)
· Experience with Maven/Gradle
· Experience in Angular 13+ and RxJS; NgRx is an added advantage
Posted 1 week ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Java Full Stack Developer
Minimum Experience: 6+ years
Location: Pune

Required Skills:
Strong proficiency in ReactJS and TypeScript
Strong expertise in Java, Spring Framework, Spring Boot, and RESTful APIs
Experience with PostgreSQL and Apache Kafka development
Solid understanding of CI/CD pipelines, with hands-on experience using tools such as Chef, Jenkins, SonarQube, Checkmarx, Maven, and Gradle
Proficiency in low-level system design
Skilled in code reviews and maintaining code quality
Ability to independently handle complex and challenging tasks
Excellent communication skills and team collaboration

Why Work at Apexon?
We care about your growth, health, and happiness. Here are some perks you'll enjoy:
Health insurance (covers you and your family)
Paid leaves and holidays
Hybrid work culture
Career development and learning programs
Wellness support programs
Rapidly growing company: among the fastest-growing digital engineering firms
Tech-forward: cutting-edge work in AI, ML, automation, and cloud
Extensive learning and upskilling opportunities
Award-winning workplace
Festivals, milestones, and team celebrations
Hackathons, wellness activities, R&R, employee spotlights

About Apexon
Apexon is a digital-first technology company helping businesses grow through innovation and smarter digital solutions. We work with clients at every step of their digital journey, using tools like AI, data, cloud, apps, and user experience design to create powerful digital products.
Visit: www.apexon.com | https://www.linkedin.com/company/apexon/
Posted 1 week ago
10.0 - 14.0 years
20 - 30 Lacs
Noida, Pune, Bengaluru
Hybrid
Greetings from Infogain! We have an immediate requirement for a Big Data Engineer (Lead) position at Infogain India Pvt. Ltd.

As a Big Data Engineer (Lead), you will be responsible for leading a team of big data engineers. You will work closely with clients and team members to understand their requirements and develop architectures that meet their needs. You will also be responsible for providing technical leadership and guidance to your team.

Mode of Hiring: Permanent
Skills: (Azure OR AWS) AND Apache Spark OR Hive OR Hadoop AND Spark Streaming OR Apache Flink OR Kafka AND NoSQL AND Shell OR Python
Experience: 10 to 14 years
Location: Bangalore/Noida/Gurgaon/Pune/Mumbai/Kochi
Notice period: Early joiners preferred
Educational Qualification: BE/BTech/MCA/M.Tech

Working Experience
12-15 years of broad experience working with enterprise IT applications in cloud platform and big data environments.

Competencies & Personal Traits
Work as a team player
Excellent problem analysis skills
Experience with at least one cloud infrastructure provider (Azure/AWS)
Experience in building data pipelines using batch processing with Apache Spark (Spark SQL, DataFrame API) or Hive query language (HQL)
Experience in building streaming data pipelines using Apache Spark Structured Streaming or Apache Flink on Kafka and Delta Lake (see the illustrative sketch after this posting)
Knowledge of NoSQL databases; good to have experience in Cosmos DB, RESTful APIs, and GraphQL
Knowledge of Big Data ETL processing tools, data modelling, and data mapping
Experience with Hive and Hadoop file formats (Avro / Parquet / ORC)
Basic knowledge of scripting (shell / bash)
Experience working with multiple data sources, including relational databases (SQL Server / Oracle / DB2 / Netezza), NoSQL / document databases, and flat files
Basic understanding of CI/CD tools such as Jenkins, JIRA, Bitbucket, Artifactory, Bamboo, and Azure DevOps
Basic understanding of DevOps practices using Git version control
Ability to debug, fine-tune, and optimize large-scale data processing jobs

Share your CV at arti.sharma@infogain.com with the following details:
Total experience
Relevant experience in Big Data
Relevant experience in AWS or Azure cloud
Current CTC
Expected CTC
Current location
OK with Bangalore location (yes/no)
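For illustration only (not Infogain's code), a minimal Kafka-to-Delta-Lake streaming pipeline of the kind this role describes might look like the following; the broker, topic, and lake paths are assumptions, and the job assumes delta-spark and the Kafka connector are configured on the Spark session:

```python
# Hypothetical sketch: streaming from Kafka into a bronze Delta table with
# Spark Structured Streaming. Broker, topic, and paths are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("ingested_at"),
    )
)

# Append raw payloads to a bronze Delta table; the checkpoint gives the
# sink exactly-once semantics across restarts.
(events.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/lake/_checkpoints/clickstream_bronze")
    .start("/lake/bronze/clickstream")
    .awaitTermination())
```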
Posted 1 week ago
6.0 years
0 Lacs
Sanganer, Rajasthan, India
On-site
Unlock yourself. Take your career to the next level.

At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you?
You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering.

As a Snowflake Data Engineering Lead, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

In This Role, You Will
Lead the design and architecture of end-to-end data warehousing and data lake solutions, focusing on the Snowflake platform and incorporating best practices for scalability, performance, security, and cost optimization
Assemble large, complex data sets that meet functional and non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Lead and mentor both onshore and offshore development teams, creating a collaborative environment
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, DBT, Python, AWS, and Big Data tools (see the illustrative sketch after this posting)
Develop ELT processes to ensure timely delivery of required data for customers
Implement data quality measures to ensure the accuracy, consistency, and integrity of data
Design, implement, and maintain data models that can support the organization's data storage and analysis needs
Deliver technical and functional specifications to support data governance and knowledge sharing

In This Role, You Will Have
Bachelor's degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
6+ years of experience delivering consulting services to medium and large enterprises; implementations must have included a combination of Data Warehousing and Big Data consulting for mid-to-large-sized organizations
3+ years of experience specifically with Snowflake, demonstrating deep expertise in its core features and advanced capabilities
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate them into a data architecture
SnowPro Core certification is highly desired
Hands-on experience with Python (Pandas, DataFrames, functions)
Strong proficiency in SQL (stored procedures, functions), including debugging, performance optimization, and database design
Strong experience with Apache Airflow and API integrations
Solid experience in at least one ETL/ELT tool (DBT, Coalesce, Wherescape, Mulesoft, Matillion, Talend, Informatica, SAP BODS, DataStage, Dell Boomi, etc.)
Nice to have: experience in Docker, DBT, data replication tools (SLT, Fivetran, Airbyte, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it's important we find the right position for you. It's all about timing, and it can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
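As a hedged sketch only (not Atrium's code), a Python-driven Snowflake load-and-upsert step of the kind this role describes might look like the following; the account, credentials, stage, and table names are placeholders:

```python
# Hypothetical sketch: stage-to-table load plus idempotent MERGE in
# Snowflake via the snowflake-connector-python package. All identifiers
# and credentials are invented.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",
    user="ETL_USER",
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Land raw files from an external stage into a staging table.
    cur.execute(
        "COPY INTO staging.orders FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Idempotent upsert into the modeled layer.
    cur.execute("""
        MERGE INTO marts.orders AS t
        USING staging.orders AS s ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.status = s.status, t.updated_at = s.updated_at
        WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
                             VALUES (s.order_id, s.status, s.updated_at)
    """)
finally:
    conn.close()
```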
Posted 1 week ago
2.0 - 15.0 years
0 Lacs
Mysore, Karnataka, India
On-site
Job Responsibilities
Conduct classroom training / virtual training
Develop teaching materials, including exercises and assignments
Design assessments for various proficiency levels in each competency
Enhance course material and course delivery based on feedback to improve training effectiveness
Gather feedback from stakeholders, identify actions based on feedback, and implement changes
Program management and governance

Location: Mysore, Bangalore

Description of the Profile
We are looking for trainers with 2 to 15 years of teaching or IT experience and technology know-how in one or more of the following areas:
Java – Java programming, Spring, Spring Boot, Angular / React, Bootstrap
Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript
Data Science – Python for data science, Machine learning, Exploratory data analysis, Statistics & Probability
Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, SharePoint, etc.
MEAN / MERN stacks
SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S/4HANA
Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
Cloud & Infrastructure Management – Network administration / Database administration / Windows administration / Linux administration / Middleware administration / End User Computing / ServiceNow; cloud platforms like AWS / GCP / Azure / Oracle Cloud; Virtualization
DBMS – Oracle / SQL Server / MySQL / DB2 / NoSQL
Testing – Selenium, Micro Focus UFT, Micro Focus ALM tools, SOA testing, SoapUI, REST Assured, Appium
API and integration – API, Microservices, TIBCO, Apigee, Mule
Digital Commerce – Salesforce, Adobe Experience Manager
Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath

Training-related experience
Must have
Teaching experience: conducting training sessions in classrooms and dynamically responding to different capabilities of learners; experience in analyzing feedback from sessions and identifying action areas for self-improvement
Developing teaching material: experience in gathering training needs, identifying learning objectives, and designing training curricula; experience in developing teaching material, including exercises and assignments
Good presentation skills, excellent oral / written communication skills

Nice to have
Teaching experience: experience in delivering sessions over virtual classrooms
Program managing training: practical experience in addressing organizational training needs by leading a team of educators; setting goals, monitoring progress, evaluating performance, and communicating to stakeholders
Instructional design: developing engaging content
Designing assessments: experience in designing assessments to evaluate the effectiveness of training and gauge the proficiency of the learner
Participation in activities of the software development lifecycle, such as development, testing, configuration management, and roll-out

Educational Qualification & Experience
Must have
Bachelor's / Master's degree in Engineering, or Master's degree in Science / Computer Applications, with a consistently good academic record
2 to 15 years of relevant experience in training

Nice to have
Technology certification from any major certifying authority such as Microsoft, Oracle, Google, Amazon, Scrum, etc.
Certification in teaching or eLearning content development
Posted 1 week ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
JD - Data Engineer

Pattern values data and the engineering required to take full advantage of it. As a Data Engineer at Pattern, you will be working on business problems that have a huge impact on how the company maintains its competitive edge.

Essential Duties And Responsibilities
Develop, deploy, and support real-time, automated, scalable data streams from a variety of sources into the data lake or data warehouse.
Develop and implement data auditing strategies and processes to ensure data quality; identify and resolve problems associated with large-scale data processing workflows; implement technical solutions to maintain data pipeline processes and troubleshoot failures (see the illustrative sketch after this posting).
Collaborate with technology teams and partners to specify data requirements and provide access to data.
Tune application and query performance using profiling tools and SQL or other relevant query languages.
Understand business, operations, and analytics requirements for data.
Build data expertise and own data quality for assigned areas of ownership.
Work with data infrastructure to triage issues and drive to resolution.

Required Qualifications
Bachelor's degree in Data Science, Data Analytics, Information Management, Computer Science, Information Technology, a related field, or equivalent professional experience
More than 4 years of overall experience
3+ years of experience working with SQL
3+ years of experience in implementing modern data architecture-based data warehouses
2+ years of experience working with data warehouses such as Redshift, BigQuery, or Snowflake, with an understanding of data architecture design
Excellent software engineering and scripting knowledge
Strong communication skills (both in presentation and comprehension) along with an aptitude for thought leadership in data management and analytics
Expertise with data systems, working with massive data sets from various data sources
Ability to lead a team of data engineers

Preferred Qualifications
Experience working with time series databases
Advanced knowledge of SQL, including the ability to write stored procedures, triggers, analytic/windowing functions, and tuning
Advanced knowledge of Snowflake, including the ability to write and orchestrate streams and tasks
Background in Big Data, non-relational databases, machine learning, and data mining
Experience with cloud-based technologies including SNS, SQS, SES, S3, Lambda, and Glue
Experience with modern data platforms like Redshift, Cassandra, DynamoDB, Apache Airflow, Spark, or ElasticSearch
Expertise in data quality and data governance

Our Core Values
Data Fanatics: Our edge is always found in the data
Partner Obsessed: We are obsessed with partner success
Team of Doers: We have a bias for action
Game Changers: We encourage innovation

About Pattern
Pattern is the premier partner for global e-commerce acceleration and is headquartered in Utah's Silicon Slopes tech hub, with offices in Asia, Australia, Europe, the Middle East, and North America. Valued at $2 billion, Pattern has been named one of the fastest-growing tech companies in North America by Deloitte and one of the best-led companies in America by Inc. More than 100 global brands, like Nestle, Sylvania, Kong, Panasonic, and Sorel, rely on Pattern's global e-commerce acceleration platform to scale their business around the world. We place employee experience at the center of our business model and have been recognized as one of America's Most Loved Workplaces®. https://pattern.com/
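As a hedged illustration only (not Pattern's code), a simple data-quality audit of the kind the responsibilities describe can be expressed as named SQL checks run over a DB-API connection; the table, columns, and thresholds below are invented:

```python
# Hypothetical sketch: named data-quality checks against a warehouse via
# any DB-API connection (psycopg2, snowflake-connector, etc.).
checks = {
    "orders_not_empty": "SELECT COUNT(*) FROM analytics.orders",
    "orders_no_null_ids": "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL",
    "orders_no_future_dates": "SELECT COUNT(*) FROM analytics.orders WHERE order_date > CURRENT_DATE",
}

def run_audit(conn):
    """Return a list of (check_name, offending_count) for failed checks."""
    failures = []
    cur = conn.cursor()
    for name, sql in checks.items():
        cur.execute(sql)
        count = cur.fetchone()[0]
        # The emptiness check must return > 0 rows; all others must be 0.
        ok = count > 0 if name == "orders_not_empty" else count == 0
        if not ok:
            failures.append((name, count))
    return failures

# Usage (with some open connection `conn`):
#   for name, count in run_audit(conn):
#       print(f"FAILED {name}: {count}")
```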
Posted 1 week ago
4.0 - 9.0 years
3 - 7 Lacs
Pune
Work from Office
APPLICATION DEVELOPER

More than 4 years of experience in the Discovery / ITOM domain on the ServiceNow platform. Very good implementation knowledge of ServiceNow Discovery, Service Mapping, Certificate Management, and AIOps. Good understanding of, and hands-on experience with, AWS and Azure cloud. Good to have knowledge and working experience of other ITOM modules such as Event Management and Cloud Management. Good configuring / troubleshooting skills on IT infrastructure (server, network, storage, cloud) and experience working as a system / platform administrator. Good knowledge of middleware solutions such as IIS, WebLogic, WebSphere, Apache Tomcat, etc. Worked in a datacenter environment as L3 / L4 support. Technical knowledge of the following areas: Java, HTML, JavaScript, LDAP / Active Directory. Working knowledge of configuration management (CMDB), methods, and processes. Excellent verbal and written communication skills, including polished presentation skills, with the ability to explain technical issues to both technical and non-technical audiences in a clear and understandable manner. Provide excellent customer service, leadership, communication, problem-solving, and decision-making skills.

Nice to have: basic knowledge and awareness of various other modules such as ITBM and Security. Must have a flair for research and innovation and be passionate about working on innovation topics.
Posted 1 week ago
0.0 - 3.0 years
0 - 0 Lacs
Chandigarh, Chandigarh
On-site
Preferred candidate profile

We are looking for a Linux Administrator with at least 3 years of experience who will be responsible for the installation and configuration of web servers and database servers. The ideal candidate should have knowledge of website deployment and will be responsible for designing, implementing, and monitoring infrastructure, with knowledge of Docker and CI/CD.

1. In-depth knowledge of Linux: RedHat, CentOS, Debian, etc.
2. Solid knowledge of installation and configuration of web servers (Nginx or Apache) and database servers (MySQL, PostgreSQL, MongoDB).
3. Knowledge of cloud services such as AWS, Azure, and DigitalOcean.
4. Knowledge of networking: switches, routers, firewalls.
5. Knowledge of Docker, CI/CD, and Terraform.
6. Knowledge of deploying websites in different languages such as PHP, NodeJS, and Python on production servers.
7. Experience in deployment of web servers, PHP, Node, and Python.

Job Type: Full-time
Pay: ₹20,000.00 - ₹30,000.00 per month
Benefits: Food provided; Health insurance; Life insurance; Paid sick time; Paid time off
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Immediate #HIRING for a highly motivated and experienced GCP Data Engineer to join our growing team. We're a leading software company specializing in Artificial Intelligence, Machine Learning, Data Analytics, innovative data solutions, and cloud-based technologies. If you're passionate about building robust applications and thrive in a dynamic environment, please share your resume at rizwana@randomtrees.com
Job Title: GCP Data Engineer
Experience: 4 - 8 years
Notice: Immediate
Location: Hyderabad / Chennai - Hybrid Mode
Job Type: Full-time Employment
Job Description:
We are looking for an experienced GCP Data Engineer to design, develop, and optimize data pipelines and solutions on Google Cloud Platform (GCP). The ideal candidate should have hands-on experience with BigQuery, DataFlow, PySpark, GCS, and Airflow (Cloud Composer), along with strong expertise or knowledge in DBT.
Key Responsibilities:
Design and develop scalable ETL/ELT data pipelines using DataFlow (Apache Beam), PySpark, and Airflow (Cloud Composer); a minimal DAG sketch follows this posting.
Work extensively with BigQuery for data transformation, storage, and analytics.
Implement data ingestion, processing, and transformation workflows using GCP-native services.
Optimize and troubleshoot performance issues in BigQuery and DataFlow pipelines.
Manage data storage and governance using Google Cloud Storage (GCS) and other GCP services.
Ensure data quality, security, and compliance with industry standards.
Work closely with data scientists, analysts, and business teams to provide data solutions.
Automate workflows, monitor jobs, and improve pipeline efficiency.
Required Skills:
✔ Google Cloud Platform (GCP) Data Engineering (GCP DE Certification preferred)
✔ DBT knowledge or experience is mandatory
✔ BigQuery - data modeling, query optimization, and performance tuning
✔ PySpark - data processing and transformation
✔ GCS (Google Cloud Storage) - data storage and management
✔ Airflow / Cloud Composer - workflow orchestration and scheduling
✔ SQL & Python - strong hands-on experience
✔ Experience with CI/CD pipelines, Terraform, or Infrastructure as Code (IaC) is a plus.
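For illustration only: a minimal Airflow (Cloud Composer) DAG sketch of the kind of orchestration described above, assuming the apache-airflow-providers-google package is installed. The project, dataset, table, and SQL are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch for a daily BigQuery transform; the project,
# dataset, and query are hypothetical placeholders, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_orders_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run the transform inside BigQuery itself (ELT style).
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": (
                    "SELECT order_id, SUM(amount) AS total "
                    "FROM `my-project.sales.orders` GROUP BY order_id"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "sales",
                    "tableId": "orders_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )
```

BigQueryInsertJobOperator submits the SQL to BigQuery rather than pulling data out, which keeps the heavy lifting in the warehouse, matching the ELT pattern the posting describes.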
Posted 1 week ago
4.0 - 7.0 years
10 - 14 Lacs
Bengaluru
Work from Office
We are seeking a skilled Golang Developer with 4+ years of hands-on software development experience. The ideal candidate will possess strong Go programming capabilities, deep knowledge of Linux internals, and experience working with service-oriented and microservice architectures.
Key Responsibilities:
4+ years of software development experience.
Good Go implementation capabilities.
Understanding of different design principles.
Good understanding of the Linux OS: memory, instruction processing, filesystem, system daemons, etc.
Fluent with the Linux command line and shell scripting.
Working knowledge of servers (nginx, apache, etc.), proxy servers, and load balancing.
Understanding of service-based architecture and microservices.
Knowledge of AV codecs, MPEG-TS, and adaptive streaming such as DASH and HLS.
Good understanding of computer networking concepts.
Working knowledge of relational databases.
Good analytical and debugging skills.
Knowledge of Git or another source code management system.
Good to Have Skills:
Working knowledge of Core Java and Python is preferred.
Exposure to cloud computing is preferred.
Exposure to API or video streaming performance testing is preferred.
Preferred experience in Elasticsearch and Kibana (ELK Stack).
Proficiency in at least one modern web front-end development framework such as React JS will be a bonus.
Preferred experience with messaging systems like RabbitMQ.
Posted 1 week ago
175.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
How will you make an impact in this role?
Expertise with handling large volumes of data coming from many disparate systems.
Expertise with Core Java, multithreading, backend processing, and transforming large data volumes.
Working knowledge of Apache Flink, Apache Airflow, Apache Beam, and other open-source data processing platforms.
Working knowledge of cloud platforms like GCP.
Working knowledge of databases and performance tuning for complex big data scenarios, such as SingleStore DB and in-memory processing.
Cloud deployments, CI/CD, and platform resiliency.
Good experience with MVEL.
Excellent communication skills, a collaboration mindset, and the ability to work through unknowns.
Work with key stakeholders to drive data solutions that align to strategic roadmaps, prioritized initiatives, and strategic technology directions.
Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management, and cost effectiveness.
Minimum Qualifications:
Bachelor's degree in Computer Science, Computer Science Engineering, or a related field is required.
3+ years of large-scale technology engineering and formal management in a complex environment, and/or comparable experience.
To be successful in this role you will need to be good in Java, Flink, SQL, Kafka, and GCP.
Successful engineering and deployment of enterprise-grade technology products in an Agile environment.
Large-scale software product engineering experience with contemporary tools and delivery methods (i.e., DevOps, CI/CD, Agile, etc.).
3+ years' experience in hands-on engineering in Java and the data/distributed ecosystem.
Ability to see the big picture with attention given to critical details.
Preferred Qualifications:
Knowledge of Kafka and Spark.
Finance domain knowledge.
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law.
Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 1 week ago
10.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Job purpose:
Design, develop, and deploy end-to-end AI/ML systems, focusing on large language models (LLMs), prompt engineering, and scalable system architecture. Leverage technologies such as Java/Node.js/.NET to build robust, high-performance solutions that integrate with enterprise systems.
Who You Are:
Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. PhD is a plus.
10+ years of experience in AI/ML development, with at least 2 years working on LLMs or NLP.
Proven expertise in end-to-end system design and deployment of production-grade AI systems.
Hands-on experience with Java/Node.js/.NET for backend development.
Proficiency in Python and ML frameworks (TensorFlow, PyTorch, Hugging Face Transformers).
Key Responsibilities:
1. Model Development & Training: Design, train, and fine-tune large language models (LLMs) for tasks such as natural language understanding, generation, and classification. Implement and optimize machine learning algorithms using frameworks like TensorFlow, PyTorch, or Hugging Face.
2. Prompt Engineering: Craft high-quality prompts to maximize LLM performance for specific use cases, including chatbots, text summarization, and question-answering systems. Experiment with prompt tuning and few-shot learning techniques to improve model accuracy and efficiency. (A minimal few-shot illustration follows this posting.)
3. End-to-End System Design: Architect scalable, secure, and fault-tolerant AI/ML systems, integrating LLMs with backend services and APIs. Develop microservices-based architectures using Java/Node.js/.NET for seamless integration with enterprise applications. Design and implement data pipelines for preprocessing, feature engineering, and model inference.
4. Integration & Deployment: Deploy ML models and LLMs to production environments using containerization (Docker, Kubernetes) and cloud platforms (AWS/Azure/GCP). Build RESTful or GraphQL APIs to expose AI capabilities to front-end or third-party applications.
5. Performance Optimization: Optimize LLMs for latency, throughput, and resource efficiency using techniques like quantization, pruning, and model distillation. Monitor and improve system performance through logging, metrics, and A/B testing.
6. Collaboration & Leadership: Work closely with data scientists, software engineers, and product managers to align AI solutions with business objectives. Mentor junior engineers and contribute to best practices for AI/ML development.
What will excite us:
Strong understanding of LLM architectures and prompt engineering techniques.
Experience with backend development using Java/Node.js (Express)/.NET Core.
Familiarity with cloud platforms (AWS, Azure, GCP) and DevOps tools (Docker, Kubernetes, CI/CD).
Knowledge of database systems (SQL, NoSQL) and data pipeline tools (Apache Kafka, Airflow).
Strong problem-solving and analytical skills.
Excellent communication and teamwork abilities.
Ability to work in a fast-paced, collaborative environment.
What will excite you:
Lead AI innovation in a fast-growing, technology-driven organization.
Work on cutting-edge AI solutions, including LLMs, autonomous AI agents, and Generative AI applications.
Engage with top-tier enterprise clients and drive AI transformation at scale.
Location: Ahmedabad
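As a minimal illustration of the few-shot prompting mentioned under Key Responsibilities, here is a sketch using the Hugging Face transformers pipeline API. The model choice and the prompt text are hypothetical; any instruction-capable model would be substituted in practice.

```python
# Minimal few-shot prompting sketch with Hugging Face transformers;
# the model name and example reviews are illustrative placeholders.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # hypothetical choice

# Few-shot prompt: show the model the pattern before the real question.
prompt = (
    "Classify the sentiment of each review as Positive or Negative.\n"
    "Review: The checkout flow was effortless. Sentiment: Positive\n"
    "Review: My order arrived broken. Sentiment: Negative\n"
    "Review: Support resolved my issue in minutes. Sentiment:"
)

result = generator(prompt, max_new_tokens=5, do_sample=False)
print(result[0]["generated_text"])
```

The in-prompt examples steer the completion toward the label format without any fine-tuning, which is the core trade-off of few-shot prompting: faster iteration at the cost of longer prompts per request.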
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Role
It is an exciting opportunity to work with a global team and act as the backbone of all reporting and analysis for the P&C Finance team.
Roles And Responsibilities
We're looking for someone who enjoys working with data and is comfortable wearing multiple hats, from working in raw, messy files to shaping dashboards people can actually use. Here's what the day-to-day may look like:
Collaborating with finance and non-finance teams to retrieve data in a timely manner across systems and divisions.
Designing and building robust data pipelines in Palantir Foundry, working with large datasets and Foundry Ontology models.
Using PySpark and Apache-based logic to enrich, align, and transform raw data from multiple source systems into streamlined data models.
Supporting and enhancing existing reporting platforms, particularly in Power BI, by updating datasets, fixing DAX, or adjusting visuals per stakeholder needs.
Building new reporting tools or dashboards that help visualize financial and operational data clearly and efficiently.
Constantly looking for ways to automate manual reporting tasks, whether via data flows, transformation logic, or reusable queries.
Working closely with stakeholders (finance, ops, and others) to understand their problems, resolve data queries, and offer practical, scalable solutions.
Taking ownership of reporting problems with a solution-first mindset: if something breaks, you're the type who dives in to figure out why and how to fix it.
About You
You don't need years of experience, but you do need curiosity, ownership, and a willingness to learn fast. This could be a perfect fit if:
You're a fresher or someone with 6 months to 1 year of experience, ideally in a data, analytics, or reporting-heavy role.
You're comfortable with SQL and Python, and you've built things using Advanced Excel, Power Query, or Power BI.
You've written some DAX logic, or are excited to learn more about how to shape metrics and KPIs.
You like working with big, messy datasets and finding ways to clean and align them so others can use them with confidence.
You're comfortable talking to business users, not just writing code, and can explain your logic without needing to "sound technical".
Maybe you've worked on a college or internship project where you pulled together data from different places and made something useful. That's great.
Prior experience with Palantir Foundry, or working with finance data, is a big plus.
We're more interested in how you think and solve problems than just checking boxes. So if you're eager to learn, open to feedback, and enjoy finding insights in data, we'd love to hear from you.
Nice to Have (but not mandatory)
These aren't must-haves, but if you've worked on any of the following, it'll definitely make you stand out (see the PySpark sketch after this posting):
You've written user-defined functions (UDFs) in PySpark to make your transformation logic reusable and cleaner across multiple pipelines.
You try to follow systematic coding practices, like organizing logic into steps, adding meaningful comments, or handling edge cases cleanly.
You've worked with version control (Git or similar), and understand how to manage updates to code or revert changes if something breaks.
You care about performance optimization, like reducing pipeline runtime, minimizing joins, or improving how fast visuals load in tools like Power BI.
You're comfortable thinking not just about "how to get it to work" but also "how to make it better, faster, and easier to maintain."
About Swiss Re
Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world.
Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics. In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability.
If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience.
Keywords
Reference Code: 134825
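A minimal sketch of the reusable PySpark UDF practice called out in the nice-to-haves above; the column name and the normalization rule are hypothetical placeholders.

```python
# Minimal reusable PySpark UDF sketch; the column name and the
# zero-padding rule are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-demo").getOrCreate()

@F.udf(returnType=StringType())
def normalize_code(raw):
    """Trim, upper-case, and zero-pad a hypothetical ledger code."""
    if raw is None:
        return None
    return raw.strip().upper().zfill(6)

df = spark.createDataFrame([(" ab12 ",), (None,)], ["ledger_code"])
df.withColumn("ledger_code_norm", normalize_code("ledger_code")).show()
```

Worth noting for the performance-minded bullet above: built-in column functions (F.trim, F.upper, F.lpad) usually outperform Python UDFs because they avoid serialization overhead; a UDF is shown here only to illustrate the reusable-function pattern.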
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Lead Software Engineer (Java, Spring Boot, Cloud)
Overview
Who is Mastercard? Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.
The Fraud Products team (part of O&T) is developing new capabilities for Mastercard's Decision Management Platform, which serves as the core for multiple business solutions to combat fraud and validate cardholder identity. Our patented Java-based platform processes billions of transactions per month in tens of milliseconds using a multi-tiered, message-oriented approach for high performance and availability. Mastercard software engineering teams leverage Agile development principles, advanced development and design practices, and an obsession over security, reliability, and performance to deliver solutions that delight our customers. We're looking for talented software development engineers and architects to develop advanced technologies and applications that are revolutionizing payments.
Would you like to develop industry-leading solutions for fighting fraud? Are you motivated by speeding business solutions to market? Do you want to innovate, using cutting-edge technologies on challenging business problems? Do you want to work for a company that offers above-and-beyond benefits including paid parental leave, flexible work hours, gift matching, and even volunteer incentives, while encouraging your own professional learning and development? Do you thrive in a place where you are continuously learning more while growing your skills and career?
Role
Successfully lead major projects and complex assignments with broad scope and long-term business implications.
Work closely with other technical leads on assigned projects to assist in design and implementation tasks.
Assist with production support issues by acting as a subject matter expert in resolving incidents and problem tickets.
Plan, design and develop technical solutions and alternatives to meet business requirements in adherence with Mastercard standards, processes, and best practices.
Participate in PoCs (Proof of Concept) and help the department with selection of vendor solutions, technologies, methodologies, and frameworks.
Design and build technical roadmaps to optimize services and functions with a focus on performance and cost/benefit optimization.
Conduct brownbag sessions on new and upcoming technologies, methodologies, and application-appropriate frameworks.
Actively look for opportunities to enhance standards and improve process efficiency.
Be an integral part of the Agile SAFe discovery and elaboration sessions.
Perform requirements and design reviews, peer code reviews, and PCI security reviews to ensure compliance with Mastercard standards.
Have strong ownership of your team's software and be deep in its maintenance characteristics, runtime properties, and dependencies, including hardware, operating system, and build.
Communicate, collaborate, and work effectively in a global environment.
Public speaking as a technology evangelist for Mastercard.
All About You
Must be high-energy, detail-oriented, proactive, and able to function under pressure in an independent environment.
Must have a high degree of initiative and self-motivation to drive results.
Possesses strong communication skills, both verbal and written, and strong relationship, collaborative, and organizational skills.
Willingness and ability to learn and take on challenging opportunities, and to work as a member of a matrix-based, diverse, and geographically distributed project team.
Knowledge of software development processes, including agile processes and test-driven development.
Experience with the design and development of complex, multi-tier cloud-native architectures.
Degree in Computer Science or a related field.
Essential Skills Required
Technical experience using Java/J2EE
Spring Framework (including Spring Boot)
Distributed computing at scale
Cloud technologies like Cloud Foundry and Kubernetes
Strong Linux and shell scripting
Oracle, PL/SQL, and advanced SQL scripting
IDEs such as JBoss Developer Studio or IntelliJ
Desirable Skills
Experience working with at-scale distributed compute such as GemFire, Apache Spark, distributed Redis, Hazelcast, GridGain, or similar
Messaging: MQ and JMS
Experience integrating vendor and open-source products into an overall system
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Dev Engineer
Overview
Be part of the Operations & Technology Fraud Products team developing new capabilities for Mastercard's Decision Management Platform, which serves as the core for multiple business solutions to combat fraud and validate cardholder identity. Our patented Java-based platform processes billions of transactions per month in tens of milliseconds using a multi-tiered, message-oriented approach for high performance and availability.
Would you like to develop industry-leading solutions for fighting fraud? Are you motivated by speeding business solutions to market? Do you want to innovate, using cutting-edge technologies on challenging business problems?
Role
Deliver solutions by providing direct development of software.
Work closely with technical leads for assigned projects to assist in design and implementation tasks.
Assist with production support issues by acting as a subject matter expert in resolving incidents and problem tickets.
Plan, design and develop technical solutions and alternatives to meet business requirements in adherence with Mastercard standards, processes and best practices.
Lead day-to-day system development and maintenance activities of the team to meet service level agreements (SLAs) and create solutions with a high level of innovation, cost effectiveness, high quality, and faster time to market.
Accountable for the full systems development life cycle, including creating high-quality requirements documents, use cases, design, and other technical artifacts, including but not limited to detailed test strategy/test design, performance benchmarking, release rollout and deployment plans, contingency/back-out plans, feasibility studies, cost and time analysis, and detailed estimates.
Participate in PoCs (Proof of Concept) and help the department with selection of vendor solutions, technologies, methodologies, and frameworks.
Conduct brownbag sessions on new and upcoming technologies, methodologies, and application-appropriate frameworks. Ensure knowledge transfer of vendor technology to Mastercard staff. Provide technical training to the other team members.
Actively look for opportunities to enhance standards and improve process efficiency.
Mentor and guide other team members during all phases of the SDLC.
Ensure adequate test coverage in unit testing, system testing/integration testing, and performance testing.
Perform quality inspections and walkthroughs throughout the SDLC, including requirements review, design review, code review, and security review, to ensure compliance with Mastercard standards.
All About You
Must be high-energy, detail-oriented, proactive, and able to function under pressure in an independent environment.
Must have a high degree of initiative and self-motivation to drive results.
Possesses strong communication skills, both verbal and written, and strong relationship, collaborative, and organizational skills.
Willingness and ability to learn and take on challenging opportunities, and to work as a member of a matrix-based, diverse, and geographically distributed project team.
Good knowledge of Agile software development processes.
Experience with the design and development of complex, multi-tier software solutions.
Comfortable working in a Linux environment, using the vi editor, with general command-line proficiency.
Essential Skills:
○ Creating and debugging J2EE REST web services and web applications
○ Database experience, including Oracle and SQL scripting
○ Experience with the Spring Framework (including Spring Boot) and Maven
○ Experience writing unit tests with JUnit and Mockito
○ Experience working with Apache Tomcat
○ Experience with Git
Desirable skills:
○ Experience working with containerized environments, such as Kubernetes/OpenShift/Cloud Foundry
○ Experience with integration frameworks such as Apache Camel/Spring Integration
○ Experience with monitoring service performance
○ Experience with Angular or modern SPA frameworks such as React + Redux. Experience with HTML5, ES5+/ES6 and/or TypeScript, SASS, and CSS3.
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
Posted 1 week ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description:
We are seeking a highly skilled and motivated Java Developer to join our development team. The ideal candidate will have strong experience in building high-performing, scalable, enterprise-grade applications. You will be responsible for Java application development while providing expertise in the full software development lifecycle.
Responsibilities:
• Understand integration workflows, architectures, the development process, the deployment process, and the support process.
• Develop, deliver, and support integration modules/services (API services and integration adapters on the existing platform: AWS Cloud, AWS API Gateway, etc.).
• Develop unit test and integration test cases to make sure the integration flow works as required.
• Monitor integration workflows and perform analysis of incidents, defects, bugs, and issues in the integration area.
• Good knowledge of software development practices and the ability to apply design principles to code.
• Good sense of urgency; able to prioritize work appropriately. Understand and adopt changes quickly and reasonably.
• Willing to work in a team; able to communicate efficiently and concisely.
• Enjoy optimizing everything from how your code is compiled to how it scales across servers to provide the best end-user experience.
• Able to coach others and initiate innovative ideas (senior role).
Qualifications:
• Strong in the Java programming language and Java frameworks (Spring, Apache Camel, etc.).
• Good experience in the software integration area (middle and senior level), or willingness to learn software integration.
• Experience in event messaging, including Apache Kafka, JMS, Apache ActiveMQ, RabbitMQ, AWS SQS, AWS Kinesis, etc.
• Experience with Git, AWS Cloud, and other AWS services.
• Good experience in developing web services, both REST API and SOAP API, and API security (certificates, OAuth 2, basic authentication, etc.).
• Experience in using ELK or another application log management tool (Splunk).
• Able to influence and drive projects to meet key milestones, and comfortable working to overcome challenges.
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.
Title And Summary
Senior Analyst, Big Data Analytics & Engineering
Overview
Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India)
About Mastercard
Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all.
Position Overview
This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, and calls for 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard's internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact.
Role Responsibilities
Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts (a minimal ETL sketch follows this posting).
Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations.
Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage.
All About You
Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx.
Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. Experience with database management systems and data warehousing solutions.
Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI.
Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making.
Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams.
Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value.
Education
Bachelor's degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.
Why Us?
At Mastercard, you'll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard's internal teams create better customer engagement strategies through innovative value-based ROI narratives.
Location: Gurgaon/Pune, India
Employment Type: Full-Time
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
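For illustration: a minimal pandas sketch of the extract-transform-load pattern the role describes, producing a tidy table a BI tool could consume. The file path, column names, and aggregation are hypothetical placeholders.

```python
# Minimal ETL sketch in pandas; file paths and column names are
# hypothetical placeholders, not from the posting.
import pandas as pd

# Extract: read raw engagement records from a hypothetical CSV export.
raw = pd.read_csv("engagements.csv", parse_dates=["engagement_date"])

# Transform: drop incomplete rows, derive a quarter, aggregate by segment.
clean = raw.dropna(subset=["customer_segment", "value_usd"])
clean["quarter"] = clean["engagement_date"].dt.to_period("Q").astype(str)
summary = (
    clean.groupby(["quarter", "customer_segment"], as_index=False)["value_usd"]
    .sum()
    .rename(columns={"value_usd": "total_value_usd"})
)

# Load: write a tidy table that Tableau or Power BI can consume directly.
summary.to_csv("value_summary.csv", index=False)
print(summary.head())
```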
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
Role Expectations:
Perform functional, performance, and load testing of web applications using tools such as JMeter and Postman.
Develop, maintain, and execute automated test scripts using Selenium with Java for web application testing.
Design and implement tests for RESTful APIs using REST Assured (a Java library) for testing HTTP responses and ensuring proper API functionality. (A language-neutral API-test sketch follows this posting.)
Collaborate with development teams to identify and resolve software defects through effective debugging and testing.
Utilize the Robot Framework with Python for acceptance testing and acceptance test-driven development.
Conduct end-to-end testing and ensure that systems meet all functional requirements.
Ensure quality and compliance of software releases by executing thorough test cases and evaluating product quality.
Qualifications:
Experience range: 3 to 8 years.
Postman API Testing: experience in testing RESTful APIs and web services using Postman.
Java: strong knowledge of Java for test script development, particularly with Selenium and REST Assured.
JMeter: experience in performance, functional, and load testing using Apache JMeter.
Selenium with Java: expertise in Selenium WebDriver for automated functional testing, including script development and maintenance using Java.
REST Assured: proficient in using the REST Assured framework (a Java library) for testing REST APIs and validating HTTP responses.
Robot Framework: hands-on experience with the Robot Framework for acceptance testing and test-driven development (TDD) in Python.
Networking Knowledge: deep understanding of networking concepts, specifically around RAN elements and network architectures (ORAN, SMO, RIC, OSS).
ORAN/SMO/RIC/OSS Architecture: in-depth knowledge of ORAN (Open Radio Access Network), SMO (Service Management Orchestration), RIC (RAN Intelligent Controller), and OSS (Operations Support Systems) architectures.
Monitoring Tools: experience with Prometheus, Grafana, and Kafka for real-time monitoring and performance tracking of applications and systems.
Keycloak: familiarity with Keycloak for identity and access management.
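REST Assured tests are written in Java and Postman tests in the Postman app; as a language-neutral illustration of the same status-code, header, and body assertions, here is a sketch in Python using requests with pytest. The endpoint and payload fields are hypothetical.

```python
# Language-neutral illustration of the API assertions described above,
# using Python requests + pytest instead of REST Assured (Java);
# the endpoint and fields are hypothetical placeholders.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test

def test_get_user_returns_expected_payload():
    resp = requests.get(f"{BASE_URL}/users/42", timeout=10)

    # Status-code assertion, analogous to statusCode(200) in REST Assured.
    assert resp.status_code == 200

    # Header and body assertions, analogous to body("id", equalTo(42)).
    assert resp.headers["Content-Type"].startswith("application/json")
    payload = resp.json()
    assert payload["id"] == 42
    assert "email" in payload
```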
Posted 1 week ago