
8977 Relational Jobs

JobPe aggregates listings for easy access to applications, but you apply directly on the original job portal.

7.0 years

5 - 15 Lacs

Chandigarh

On-site

Source: Glassdoor

Role Summary:
Lead the architecture, design, and engineering of our rewards and voucher redemption platform. You will own critical modules such as wallet services, voucher integrations, API-first design, security, compliance, and platform scalability, shaping how corporates distribute digital rewards at scale.

Responsibilities:
- Design and build microservices for the points wallet, voucher catalogue, redemption flow, and campaign pipelines
- Integrate with voucher aggregators, messaging APIs (SMS/Email/WhatsApp), and HRMS/analytics systems
- Ensure PCI-DSS, GDPR/privacy, and internal security best practices
- Lead the development team, define reusable components, and enforce code standards
- Collaborate with PM and UX to deliver robust APIs and real-time systems with under 2-second latency

Ideal Experience:
- 7+ years in SaaS/fintech platforms (wallets, loyalty, voucher engines)
- Strong in Java/Kotlin, Node.js, Go, or similar backend stacks, plus React/Flutter for the front end
- Designed scalable systems (500k+ users) using microservices, message queues, and caching
- Hands-on with relational and NoSQL databases, CI/CD pipelines, and monitoring (Prometheus, Grafana)
- Knowledge of payments compliance in India (RBI e-gift rules, data localization)

Job Type: Full-time
Pay: ₹578,469.27 - ₹1,500,000.00 per year
Schedule: Day shift
Work Location: In person
Speak with the employer: +91 9619171318
Application Deadline: 13/07/2025
Expected Start Date: 15/07/2025
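For illustration only, the sketch below shows the kind of idempotent points-debit logic a wallet service like the one described above typically needs. It is a minimal Python toy; every class, field, and method name here is hypothetical rather than taken from the employer's stack.

```python
import uuid
from dataclasses import dataclass, field


@dataclass
class PointsWallet:
    """Toy in-memory wallet illustrating idempotent debits for voucher redemption."""
    balance: int
    _processed: dict = field(default_factory=dict)  # idempotency_key -> prior result

    def redeem(self, points: int, idempotency_key: str) -> dict:
        # Replaying the same request returns the original result instead of double-debiting.
        if idempotency_key in self._processed:
            return self._processed[idempotency_key]
        if points <= 0 or points > self.balance:
            result = {"status": "rejected", "balance": self.balance}
        else:
            self.balance -= points
            result = {"status": "redeemed", "points": points, "balance": self.balance}
        self._processed[idempotency_key] = result
        return result


if __name__ == "__main__":
    wallet = PointsWallet(balance=1000)
    key = str(uuid.uuid4())
    print(wallet.redeem(300, key))   # debits once
    print(wallet.redeem(300, key))   # replay: same result, no second debit
```

The idempotency key is what lets a retried redemption request (for example, after a messaging timeout) complete exactly once.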

Posted 21 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Cochin

Remote

Source: Glassdoor

We are seeking a knowledgeable and experienced Microsoft SQL Trainer to deliver comprehensive training programs on Microsoft SQL Server. The trainer will be responsible for equipping learners with the skills to query, manage, and administer SQL databases efficiently. The role involves curriculum development, hands-on demonstrations, and guiding learners through real-time projects.

Key Responsibilities:
- Design and deliver training sessions on Microsoft SQL Server (T-SQL, DDL/DML, stored procedures, indexing, performance tuning, etc.)
- Teach database fundamentals, SQL query writing, database design, and optimization techniques.
- Develop training materials, manuals, exercises, and assessments.
- Conduct workshops for beginner- to advanced-level learners.
- Provide hands-on experience through projects and lab sessions.
- Evaluate student performance and provide constructive feedback.
- Stay updated with the latest updates and versions of SQL Server and incorporate them into training.
- Customize training programs to suit various industry or domain needs.
- Mentor students for certification exams (such as Microsoft Certified: Azure Data Fundamentals / Database Administrator Associate).

Required Skills & Qualifications:
- Strong experience in Microsoft SQL Server and T-SQL (minimum 3–5 years preferred).
- Proficient in database development, data modeling, optimization, and administration.
- Experience with tools such as SSMS, Azure Data Studio, and SQL Profiler.
- Good understanding of relational database concepts and normalization.
- Prior experience in teaching/training is highly desirable.
- Excellent communication and presentation skills.
- Ability to explain complex concepts in a simple and engaging manner.

Preferred Qualifications:
- Microsoft certifications (e.g., MCSA: SQL Server, Azure Data Engineer).
- Exposure to cloud-based SQL solutions such as Azure SQL Database.
- Knowledge of Power BI or integration with reporting tools is a plus.

Work Environment:
- Flexible work hours (for remote/part-time roles)
- Interactive classroom or online training sessions
- Continuous learning and upskilling environment

Job Types: Part-time, Freelance
Schedule: Day shift / Evening shift / Fixed shift / Monday to Friday / Morning shift / Night shift / Rotational shift / Weekend availability
Work Location: In person

Posted 21 hours ago

Apply

4.0 - 6.0 years

3 Lacs

Thiruvananthapuram

On-site

Source: Glassdoor

Job Requirements

Quest Global is an organization at the forefront of innovation and one of the world's fastest growing engineering services firms, with deep domain knowledge and recognized expertise in the top OEMs across seven industries. We are a twenty-five-year-old company on a journey to becoming a centenary one, driven by aspiration, hunger and humility. We are looking for humble geniuses who believe that engineering has the potential to make the impossible possible; innovators who are not only inspired by technology and innovation, but also perpetually driven to design, develop, and test as a trusted partner for Fortune 500 customers. As a team of remarkably diverse engineers, we recognize that what we are really engineering is a brighter future for us all. If you want to contribute to meaningful work and be part of an organization that truly believes when you win, we all win, and when you fail, we all learn, then we're eager to hear from you.

The achievers and courageous challenge-crushers we seek have the following characteristics and skills:

Roles & Responsibilities:
- Design, develop, and maintain scalable and efficient data pipelines.
- Data engineering, ETL development, and data integration.
- Develop data pipelines using Python.
- Visualize data in Kibana dashboards.
- SQL skills and experience working with relational databases.
- Knowledge of cloud-based data solutions and services.
- Take ownership of assigned jobs that are part of new feature implementations, bug fixes, and enhancement activities.
- Document projects according to project standards (architecture, technical specifications).
- Technical communication with internal and external stakeholders.
- Work independently, contribute to the immediate team, and work with architects and other leads.

Work Experience

Required Skills (Technical Competency):
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4–6 years of experience in data engineering, preferably in a healthcare domain.
- Strong SQL and Python skills; experience with Spark is a plus.
- Knowledge of cloud architecture.
- Excellent programming and debugging skills.
- Ability to write effective and reusable code according to best practices.
- Experience with HIPAA-compliant data handling and security practices.
- Excellent communication and presentation skills.
- Good customer-interfacing skills.

Desired Skills:
- Experience in Python and PySpark
- Elasticsearch, Kibana
- Knowledge of SQL Server
- Ability to deliver without much supervision from leads/managers

Posted 21 hours ago

Apply

1.0 years

1 - 1 Lacs

Gurgaon

Remote

Source: Glassdoor

Note: Apply only if you are available for a face-to-face interview and are comfortable working from the office. *NO WORK FROM HOME*

Job Location: Spaze ITech Park, Sec 49, Gurgaon
Experience: 6 months to 1 year

The ideal candidate is a highly resourceful and innovative developer with extensive experience in the layout, design, and coding of websites, specifically in PHP. You must also possess strong knowledge of web application development using the PHP programming language and MySQL databases.

Key Roles & Responsibilities:
- Work with development teams and product managers to ideate software solutions.
- Design client-side and server-side architecture.
- Build the front end of applications through appealing visual design.
- Build features and applications with a mobile-responsive design.
- Write effective APIs.
- Test software to ensure responsiveness and efficiency.
- Troubleshoot, debug, and upgrade software.

Desired Candidate Profile:
- Minimum 1 year of experience as a PHP Developer.
- Good understanding of, and hands-on experience with, the PHP Laravel and CodeIgniter frameworks.
- Excellent relational database skills with MySQL.
- Exposure to any open-source e-commerce CMS (WordPress, Shopify, Magento) will be an advantage.
- Working knowledge of front-end technologies such as HTML5, CSS3, and JavaScript/jQuery.
- Understanding of responsive design frameworks such as Bootstrap.
- Good communication skills.
- Self-motivated, smart-working person with a good attitude, able to work in teams and individually.

Qualifications:
- 1 year of experience in web development and software design
- Expertise in front-end technologies (HTML, JavaScript, CSS), PHP frameworks, and MySQL databases

Job Type: Full-time
Pay: ₹10,000.00 - ₹15,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Gurgaon, Haryana: Reliably commute or plan to relocate before starting work (Required)
Experience: PHP: 1 year (Required); Web development: 1 year (Required)
Location: Gurgaon, Haryana (Required)
Application Deadline: 04/07/2025
Expected Start Date: 07/07/2025

Posted 21 hours ago

Apply

0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

Ready to shape the future of work?

At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Manager, Power BI.

In this role, we are looking for a Power BI specialist with experience in Power BI and SQL, responsible for managing all Power BI-related client asks and supporting the team toward the overall project goal.

Responsibilities:
- Efficiently carry out the data preparation and data modelling necessary for visualization.
- Collect reporting requirements from various partners, architect the solution/report, understand and analyze the source data, and deliver reports in a timely manner.
- Build and design intuitive, interactive reports and dashboards in Power BI for data-driven decisions.
- Monitor performance and fine-tune and optimize dashboards.
- Connect to SQL Server and other diverse data sources from Power BI.
- Publish and share dashboards and schedule data refreshes.
- Handle complex visualization or data problems in dashboards.
- Display SQL skills, preferably on SQL Server, along with data warehousing and business intelligence concepts.
- Working knowledge of databases such as SQL Server, SQL Azure, Oracle, etc.
- A deep understanding of, and ability to use and explain, all aspects of relational database design, multidimensional database design, OLTP, OLAP, key metrics, scorecards, and dashboards.
- Ability to recommend architecture best practices related to ETL, ELT, BI, and the life cycle of an EDW solution.
- Good to have: advanced MS Excel skills (Power Query, Power View, Power Pivot).

Qualifications we seek in you!

Minimum Qualifications:
- Relevant experience in Power BI and SQL
- Good communication skills
- Experience with any ETL tool preferred
- Good analytical and problem-solving skills
- Excellent MS Office skills, including MS Excel
- A flexible, dedicated, and solution-oriented approach through periods of change and disruption
- Innovative and always looking for continuous improvement

Preferred Qualifications / Skills:
- Experience in the banking and finance domain
- Six Sigma certified
- Exposure to a programming language such as Python

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Assistant Manager
Primary Location: India-Gurugram
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 27, 2025, 9:38:41 AM
Unposting Date: Ongoing
Master Skills List: Operations
Job Category: Full Time

Posted 21 hours ago

Apply

0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

Our Purpose

Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Application DevOps Engineer - SSO10168

A2A Risk Solutions builds products and services powered by payments data. By combining data science techniques with an intimate knowledge of payments data, we develop solutions that improve outcomes for people, businesses and economies. Operating globally, we craft bespoke algorithms that help our clients gain an understanding of the behaviour that drives their business. As part of the team, you will be responsible for the day-to-day operations of the A2A RS product portfolio, building automated systems to support the monitoring and operation of our services. You will work with a wider team of analysts, data scientists and technologists, designing systems alongside these teams that ensure the efficacy and efficiency of our products.

Role

Reporting to the Director of A2A RS Operations, you will:
- Build automated monitoring tools for complex, data-science-based microservice deployments.
- Ensure that our models continue to detect fraud and money laundering as they should, predict the future capably, and generate the right economic insights.
- Work with our data scientists to collect data and train models to enhance our services' accuracy and reduce unnecessary alerts.
- Work with our engineers to develop a self-healing and resilient set of microservices, promoting good operational principles during our research, design, and development phases.
- Engage with current and future technology stacks, in the UK and internationally.
- Utilise incident management processes, automating as much as possible and integrating with Mastercard's ticketing systems.
- Communicate with internal stakeholders and collaborators, as well as with technical and business functions within our clients.

All About You
- Your passion is in building robust, smooth-running services to solve real, pressing problems in the financial services sector.
- You enjoy working in a team, and have an interest in data science and how advanced algorithms may be deployed as product offerings.
- You are excited by new technology and new approaches to development, and are keen to promote their use in an enterprise setting.
- You are detail oriented, and don't mind getting your head down writing or reviewing code.
- You are happy to be on-call, though you aim to contribute to our software in such a way as to limit or even remove the need for anyone to be on-call.
- You have one or more years of demonstrable experience in software development, data science, DevOps, operations or a related discipline.
- You are comfortable communicating with a range of stakeholders, including subject matter experts, data scientists, software engineers and enterprise DevOps and security professionals.
- You are a confident software developer, and can write (or are happy to learn) Python and Go.
- You have a firm grasp of traditional data warehousing, can write SQL, and can optimise the use of a large relational database.
- You have experience with, and are interested in, contemporary approaches to service design, including the use of containers and container orchestration technologies, streaming data platforms, APIs and in-memory/NoSQL stores.

Corporate Security Responsibility

All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
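Purely as a hedged sketch of the "automated monitoring tools" this role describes, the Python snippet below polls hypothetical service health endpoints and flags anything unhealthy. The URLs, service names, and alerting behaviour are invented; a real setup would feed the organisation's incident and ticketing systems rather than print to stdout.

```python
import json
import urllib.request

# Hypothetical health endpoints for two illustrative microservices.
SERVICES = {
    "fraud-scoring": "http://localhost:8001/health",
    "aml-alerts": "http://localhost:8002/health",
}


def check(name: str, url: str, timeout: float = 2.0) -> str:
    """Return 'healthy', 'degraded', or 'unreachable' for one service."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = json.loads(resp.read().decode("utf-8"))
            return "healthy" if resp.status == 200 and body.get("status") == "ok" else "degraded"
    except (OSError, ValueError):
        # Covers connection failures, timeouts, and malformed JSON responses.
        return "unreachable"


if __name__ == "__main__":
    for name, url in SERVICES.items():
        status = check(name, url)
        if status != "healthy":
            print(f"ALERT: {name} is {status}")  # in practice, raise an incident instead
        else:
            print(f"{name}: healthy")
```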

Posted 21 hours ago

Apply

1.0 - 2.0 years

4 - 5 Lacs

Gurgaon

On-site

Source: Glassdoor

Location: Gurgaon | Experience: 1 to 2 years | Full Time

Role & Responsibilities:
- Design, develop, and maintain robust and scalable Java applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code following best practices.
- Troubleshoot, debug, and upgrade existing systems.
- Ensure the performance, quality, and responsiveness of applications.
- Participate in code reviews to maintain high code quality and share knowledge.
- Work with relational and NoSQL databases.
- Create and maintain technical documentation.
- Stay updated with the latest industry trends and technologies to ensure our solutions are up to date.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Java Developer with a strong understanding of Java and Java EE.
- Familiarity with front-end technologies like HTML5, CSS3, and TypeScript.
- Experience with the Spring Framework (Spring Boot, Spring MVC, Spring Data).
- Proficiency in working with databases such as MySQL, PostgreSQL, or MongoDB.
- Strong understanding of RESTful APIs and web services.
- Familiarity with version control systems like Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills.
- Ability to work independently and manage multiple projects simultaneously.

Posted 21 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

About us

Bain & Company is a global management consulting firm that helps the world's most ambitious change makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.

In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as the Bain Capability Network (BCN), with nodes across various geographies. BCN is an integral part, and the largest unit, of Expert Client Delivery (ECD). ECD plays a critical role by adding value to Bain's case teams globally, supporting them with analytics and research solutioning across all industries and specific domains for corporate cases, client development, private equity diligence and Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services and Shared Services.

Who you will work with

The Bain Capability Network (BCN) collaborates with global case teams to address clients' pressing business challenges. It is integrated with Bain's diverse capabilities and industry practices, leveraging sector expertise, data, research, and analytics to enhance intellectual property and deliver impactful client solutions. As part of the BCN Data Engineering team, you will play a pivotal role in supporting Bain & Company's client engagements (case work) and the development of innovative, data-driven products. This role requires a blend of technical expertise, problem-solving, and collaboration, as you'll work closely with Bain consultants, product teams, and global stakeholders to deliver impactful data solutions.

What you'll do
- Write complex code to develop scalable, flexible, user-friendly applications across a robust technology stack.
- Evaluate potential technologies for adoption, including open-source frameworks, libraries, and tools.
- Construct, test, install, and maintain software applications.
- Contribute to the planning for acceptance testing and implementation of new software, performing supporting activities to ensure that customers have the information and assistance they need for a successful implementation.
- Develop secure and highly performant services and APIs.
- Ensure the maintainability and quality of code.

About you
- A Bachelor's or Master's degree in Computer Science or a related field
- 3 to 5 years of experience in full stack development
- Proficiency in back-end technologies such as Node.js and Python (Django/Flask)
- Experience working with relational and non-relational databases (e.g., MySQL, PostgreSQL, MongoDB)
- Strong proficiency in JavaScript, TypeScript, or similar programming languages
- Familiarity with modern development tools like Git, Docker, and CI/CD pipelines
- Experience with front-end frameworks (e.g., React.js, Angular, or Vue.js)
- Knowledge of RESTful APIs and/or GraphQL
- Understanding of front-end and back-end architecture and design principles
- Basic knowledge of cloud platforms (e.g., AWS, Azure, or Google Cloud) and containerization tools like Docker or Kubernetes
- Sound SDLC skills, preferably with experience in an agile environment
- Excellent English communication skills, with the ability to effectively interface across cross-functional technology teams and the business

What makes us a great place to work

We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

The Role

Context: This is an exciting opportunity to join a dynamic and growing organization working at the forefront of technology trends and developments in the social impact sector. The Wadhwani Center for Government Digital Transformation (WGDT) works with government ministries and state departments in India with a mission of "enabling digital transformation to enhance the impact of government policy, initiatives and programs".

We are seeking a highly motivated and detail-oriented individual to join our team as a Data Engineer with experience in designing, constructing, and maintaining the architecture and infrastructure necessary for data generation, storage and processing, and to contribute to the successful implementation of digital government policies and programs. You will play a key role in developing robust, scalable, and efficient systems to manage large volumes of data, making it accessible for analysis and decision-making, and driving innovation and optimizing operations across various government ministries and state departments in India.

Key Responsibilities:
- Data Architecture Design: Design, develop, and maintain scalable data pipelines and infrastructure for ingesting, processing, storing, and analyzing large volumes of data efficiently. This involves understanding business requirements and translating them into technical solutions.
- Data Integration: Integrate data from various sources such as databases, APIs, streaming platforms, and third-party systems. Ensure data is collected reliably and efficiently, maintaining data quality and integrity throughout the process as per the ministries' and government data standards.
- Data Modeling: Design and implement data models to organize and structure data for efficient storage and retrieval, using techniques such as dimensional modeling, normalization, and denormalization depending on the specific requirements of the project.
- Data Pipeline Development / ETL (Extract, Transform, Load): Develop data pipelines and ETL processes to extract data from source systems, transform it into the desired format, and load it into the target data systems. This involves writing scripts, using ETL tools, or building data pipelines to automate the process and ensure data accuracy and consistency.
- Data Quality and Governance: Implement data quality checks and data governance policies to ensure data accuracy, consistency, and compliance with regulations. Design and track data lineage, data stewardship, metadata management, business glossaries, etc.
- Data Lakes and Warehousing: Design and maintain data lakes and data warehouses to store and manage structured data from relational databases, semi-structured data such as JSON or XML, and unstructured data such as text documents, images, and videos at any scale. Integrate with big data processing frameworks such as Apache Hadoop, Apache Spark, and Apache Flink, as well as with machine learning and data visualization tools.
- Data Security: Implement security practices, technologies, and policies designed to protect data from unauthorized access, alteration, or destruction throughout its lifecycle. This includes data access, encryption, data masking and anonymization, data loss prevention, and compliance and regulatory requirements such as DPDP, GDPR, etc.
- Database Management: Administer and optimize databases, both relational and NoSQL, to manage large volumes of data effectively.
- Data Migration: Plan and execute data migration projects to transfer data between systems while ensuring data consistency and minimal downtime.
- Performance Optimization: Optimize data pipelines and queries for performance and scalability. Identify and resolve bottlenecks, tune database configurations, and implement caching and indexing strategies to improve data processing speed and efficiency.
- Collaboration: Collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide them with access to the necessary data resources. Work closely with IT operations teams to deploy and maintain data infrastructure in production environments.
- Documentation and Reporting: Document your work, including data models, data pipelines/ETL processes, and system configurations. Create documentation and provide training to other team members to ensure the sustainability and maintainability of data systems.
- Continuous Learning: Stay updated with the latest technologies and trends in data engineering and related fields. Participate in training programs, attend conferences, and engage with the data engineering community to enhance your skills and knowledge.

Desired Skills / Competencies:
- Education: A Bachelor's or Master's degree in Computer Science, Software Engineering, Data Science, or equivalent, with at least 5 years of experience.
- Database Management: Strong expertise in working with databases, such as SQL databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Big Data Technologies: Familiarity with big data technologies, such as Apache Hadoop, Spark, and related ecosystem components, for processing and analyzing large-scale datasets.
- ETL Tools: Experience with ETL tools (e.g., Apache NiFi, Talend Open Studio, Apache Airflow, Pentaho, InfoSphere) for designing and orchestrating data workflows.
- Data Modeling and Warehousing: Knowledge of data modeling techniques and experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Data Governance and Security: Understanding of data governance principles and best practices for ensuring data quality and security.
- Cloud Computing: Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services for scalable and cost-effective data storage and processing.
- Streaming Data Processing: Familiarity with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink) for handling streaming data.

KPIs:
- Data Pipeline Efficiency: Measure the efficiency of data pipelines in terms of data processing time, throughput, and resource utilization. KPIs could include average time to process data, data ingestion rates, and pipeline latency.
- Data Quality Metrics: Track data quality metrics such as completeness, accuracy, consistency, and timeliness of data. KPIs could include data error rates, missing values, data duplication rates, and data validation failures.
- System Uptime and Availability: Monitor the uptime and availability of data infrastructure, including databases, data warehouses, and data processing systems. KPIs could include system uptime percentage, mean time between failures (MTBF), and mean time to repair (MTTR).
- Data Storage Efficiency: Measure the efficiency of data storage systems in terms of storage utilization, data compression rates, and data retention policies. KPIs could include storage utilization rates, data compression ratios, and data storage costs per unit.
- Data Security and Compliance: Track adherence to data security policies and regulatory compliance requirements such as DPDP, GDPR, HIPAA, or PCI DSS. KPIs could include security incident rates, data access permissions, and compliance audit findings.
- Data Processing Performance: Monitor the performance of data processing tasks such as ETL processes, data transformations, and data aggregations. KPIs could include data processing time, CPU usage, and memory consumption.
- Scalability and Performance Tuning: Measure the scalability and performance of data systems under varying workloads and data volumes. KPIs could include scalability benchmarks, system response times under load, and performance improvements achieved through tuning.
- Resource Utilization and Cost Optimization: Track resource utilization and costs associated with data infrastructure, including compute resources, storage, and network bandwidth. KPIs could include cost per data unit processed, cost per query, and cost savings achieved through optimization.
- Incident Response and Resolution: Monitor the response time and resolution time for data-related incidents and issues. KPIs could include incident response time, time to diagnose and resolve issues, and customer satisfaction ratings for support services.
- Documentation and Knowledge Sharing: Measure the quality and completeness of documentation for data infrastructure, data pipelines, and data processes. KPIs could include documentation coverage, documentation update frequency, and knowledge-sharing activities such as internal training sessions or knowledge base contributions.

Years of experience of the current role holder: New position
Ideal years of experience: 3–5 years
Career progression for this role: CTO, WGDT (Head of Incubation Centre)

Our Culture: WF is a global not-for-profit that works like a start-up, in a fast-moving, dynamic environment where change is the only constant and flexibility is the key to success. Three mantras that we practice across job roles, levels, functions, programs and initiatives are Quality, Speed, Scale, in that order. We are an ambitious and inclusive organization, where everyone is encouraged to contribute and ideate. We are intensely and insanely focused on driving excellence in everything we do. We want individuals with the drive for excellence, and the passion to do whatever it takes to deliver world-class outcomes to our beneficiaries. We set our own standards, often more rigorous than what our beneficiaries demand, and we want individuals who love it this way. We have a creative and highly energetic environment, one in which we look to each other to innovate new solutions not only for our beneficiaries but for ourselves too. Individuals who are open to collaborating with a borderless mentality, often going beyond the hierarchy and siloed definitions of functional KRAs, will thrive in our environment. This is a workplace where expertise is shared with colleagues around the globe. Individuals uncomfortable with change, constant innovation, and short learning cycles, and those looking for stability and orderly working days, may not find WF to be the right place for them. Finally, we want individuals who want to do greater good for society, leveraging their area of expertise, skills and experience.

The foundation is an equal opportunity firm with no bias towards gender, race, colour, ethnicity, country, language, age or any other dimension that comes in the way of progress. Join us and be a part of us!

Education: Bachelor's in Technology / Master's in Technology
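As a minimal, hypothetical illustration of the extract-transform-load pattern this role centres on (not WGDT's actual stack), the Python sketch below reads a source CSV, applies a simple data-quality rule, and loads the result into a relational table. The file name, column names, and SQLite target are all stand-ins.

```python
import csv
import sqlite3


def extract(path: str) -> list[dict]:
    """Read raw records from a source CSV (hypothetical schema)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Basic cleaning: drop rows missing an ID and normalise district names."""
    cleaned = []
    for row in rows:
        if not row.get("beneficiary_id"):
            continue  # simple data-quality check
        cleaned.append((row["beneficiary_id"], row.get("district", "").strip().title()))
    return cleaned


def load(records: list[tuple], db_path: str = "warehouse.db") -> None:
    """Load into a target relational store (SQLite stands in for the warehouse)."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS beneficiaries (id TEXT PRIMARY KEY, district TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO beneficiaries VALUES (?, ?)", records)


if __name__ == "__main__":
    load(transform(extract("source_extract.csv")))
```

In production the same shape usually sits inside an orchestrator such as Apache Airflow, with the SQLite target replaced by the actual warehouse.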

Posted 21 hours ago

Apply

1.0 - 2.0 years

0 Lacs

Delhi

On-site

Source: Glassdoor

Who We Are

Bain & Company is a global management consulting firm that helps the world's most ambitious change-makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.

Who You'll Work With

BCN Labs is a Center of Excellence (CoE) that functions like a small R&D boutique startup within the Bain ecosystem, delivering end-to-end, data-driven, client-deployable solutions across a wide variety of sectors and industries. We work directly with other CoEs and Practices within Bain as part of the Expert Client Delivery system and interface with teams across the globe. We are first and foremost business thought partners, working on intelligent ways of using analytical techniques and algorithms across the spectrum of disciplines that can enable building world-class solutions. Our goal is to build a disruptive, high-impact, business-enabled, end-to-end analytical solutions delivery system across all verticals of Bain.

What You Will Do

We're seeking an Associate - Back-End/Full-Stack Developer to join our team and build scalable, secure, and efficient backend services that power client-facing analytical applications and internal platforms. You'll work closely with front-end developers, data scientists, and business leads to implement logic, manage data flow, and ensure robust performance of web applications.

As an Associate - Back-End/Full-Stack Developer, you will:
- Build backend services: Develop modular, scalable server-side applications and services using Python-based frameworks such as FastAPI or Django.
- Design and integrate APIs: Collaborate with front-end developers and data scientists to design clean, well-documented APIs that meet evolving business needs.
- Manage data storage and interactions: Write efficient database queries, optimize storage and retrieval operations, and ensure data integrity across services.
- Collaborate across teams: Work closely with front-end developers and data scientists to integrate business logic and ensure smooth data pipelines.
- Uphold code quality: Participate in code reviews, adhere to best practices, and write maintainable and testable code.
- Drive improvement: Demonstrate strong problem-solving skills, team collaboration, and a proactive attitude toward learning and improvement.

About You

Education & Experience:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 1-2 years of real-world experience in backend development (or full-stack development) and service integration, with exposure to data-driven applications or platforms.
- You will fit into our team-oriented structure with a college/hostel-style way of working, having the comfort of reaching out to anyone for support that can enable our clients better.

Core Technical Skills:
- Strong proficiency in Python, with hands-on experience building backend services using FastAPI for high-performance, asynchronous APIs and/or Django Rest Framework (DRF) for robust, secure and scalable APIs.
- Experience designing and implementing RESTful APIs, including authentication, versioning, and documentation (e.g., OpenAPI/Swagger).
- Proficient in working with relational databases (e.g., PostgreSQL, MySQL, MS SQL Server).
- Exposure to NoSQL databases such as MongoDB or Redis (for caching) is a plus.
- Skilled in implementing business logic, data parsing, validation, and serialization using Pydantic (FastAPI) or DRF serializers.
- Exposure to deploying Python applications on cloud services such as AWS (EC2, Lambda, S3) or Azure.
- Familiarity with Docker, Git, and basic CI/CD practices for backend deployments.
- (Good to have) Basic understanding of React.js or Vue.js to better collaborate with front-end teams.
- (Good to have) Experience or exposure to integrating FastAPI/DRF backends with front-end clients via REST APIs.
- (Good to have) Understanding of how front-end apps consume and render API responses (e.g., JSON) and how to optimize for usability.

Development Tools & Practices:
- Experience with version control using Git in collaborative environments (e.g., GitHub).
- Familiarity with sprint-based work planning and code reviews.
- Exposure to testing frameworks (e.g., Pytest, Unittest) and debugging practices.

What makes us a great place to work

We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
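To make the FastAPI-plus-Pydantic combination listed above concrete, here is a minimal, self-contained sketch. The service, model, and endpoint names are invented for illustration, and the in-memory dict stands in for a real relational database.

```python
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel, Field

app = FastAPI(title="demo-analytics-api")  # hypothetical service name


class Metric(BaseModel):
    # Pydantic validates and serializes the request/response body.
    name: str = Field(min_length=1)
    value: float


_store: dict[str, float] = {}  # stand-in for a relational database


@app.post("/metrics", response_model=Metric, status_code=201)
def create_metric(metric: Metric):
    _store[metric.name] = metric.value
    return metric


@app.get("/metrics/{name}", response_model=Metric)
def read_metric(name: str):
    if name not in _store:
        raise HTTPException(status_code=404, detail="metric not found")
    return Metric(name=name, value=_store[name])


# Local run (assuming this file is saved as demo_api.py):
#   uvicorn demo_api:app --reload
```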

Posted 21 hours ago

Apply

2.0 years

2 - 4 Lacs

Pitampura

On-site

Source: Glassdoor

Position: Laravel Developer
Location: Kohat Enclave
Experience: 2+ years

Job Summary: We are looking for a Laravel Developer with experience in building scalable web applications and working on real-time features using technologies like WebSockets. The ideal candidate must have strong expertise in PHP, Laravel, and MySQL, and must be capable of integrating APIs, managing backend logic, and working collaboratively with frontend teams.

Key Responsibilities:
- Design, develop, and maintain robust Laravel-based web applications.
- Implement real-time functionality using Laravel Echo, WebSockets, or similar technologies.
- Work with RESTful APIs and integrate third-party services.
- Design and optimize relational databases (MySQL).
- Handle authentication, authorization, and middleware development.
- Write clean, secure, and reusable code with proper documentation.
- Troubleshoot, debug, and upgrade existing applications.
- Collaborate with UI/UX teams to ensure a high-quality user experience.
- Use version control systems like Git for code management.
- Optimize application performance and handle background tasks/queues.

Required Skills:
- Proficiency in Laravel (latest versions) and PHP.
- Strong experience with MySQL and database schema design.
- Good understanding of MVC architecture and OOP principles.
- Experience with real-time technologies like WebSockets, Laravel Echo, and Pusher/Socket.IO.
- Familiarity with frontend integration using Blade, Vue.js, or React (preferred).
- Working knowledge of Laravel queues, jobs, and events.
- Proficient with Git, Postman, and debugging tools.
- Understanding of security best practices (CSRF, XSS, SQL injection).
- Experience with deployment on shared or cloud servers (e.g., AWS, DigitalOcean).

Interested candidates can forward their resume to hr@axepertexhibits.com or call 9211659314.

Job Type: Full-time
Pay: ₹20,000.00 - ₹35,000.00 per month
Schedule: Day shift
Supplemental Pay: Overtime pay
Work Location: In person

Posted 21 hours ago

Apply

3.0 years

0 - 0 Lacs

Hyderābād

On-site

Source: Glassdoor

About the Role:

We are seeking a skilled and experienced Software Engineer III (Frontend) to join our team and contribute to the development and maintenance of our React-based web applications. You will play a key role in building and improving our user interfaces, working closely with designers and backend engineers to deliver high-quality, performant, and scalable solutions. This position requires a strong understanding of React, best practices in frontend development, and a commitment to producing clean, well-documented code.

Responsibilities:
- Frontend Development: Develop and maintain React-based web applications using our established design system. Implement new features and enhancements based on designs and requirements.
- Performance Optimization: Implement efficient and scalable solutions, prioritizing performance and site speed. Proactively identify and resolve performance bottlenecks.
- API Integration: Integrate with REST APIs using AJAX, Node, and other relevant technologies. Handle asynchronous operations and manage data efficiently.
- Code Quality: Write clean, well-documented, and testable code that adheres to our coding standards and best practices. Conduct thorough code reviews and provide constructive feedback to peers.
- Collaboration: Collaborate effectively with designers, backend engineers, and product managers to understand requirements, provide technical input, and ensure seamless integration of frontend and backend components.
- Design System Contribution: Contribute to the continuous improvement of our design system by identifying areas for improvement, proposing solutions, and implementing enhancements.
- Testing & Automation: Implement automated testing strategies using frameworks such as Jest and React Testing Library to ensure code quality and prevent regressions.
- Deployment & Infrastructure: Utilize and manage deployments on AWS (or similar cloud infrastructure). Understand CI/CD pipelines and contribute to their improvement.
- Knowledge Sharing: Actively participate in code reviews and knowledge sharing within the team. Mentor junior engineers and share expertise.

Qualifications:
- Bachelor's degree in Computer Science or a related field, or equivalent experience.
- 3-5 years of professional experience in frontend development with strong expertise in React, JavaScript (ES6+), Redux, Node, HTML/CSS, and responsive design.
- Proficient in Go or Java, React, and Node.
- Experience with modern JavaScript frameworks/libraries (e.g., Redux, Zustand, Context API).
- Strong understanding of JavaScript, HTML, CSS, and responsive design principles.
- Experience designing and developing RESTful APIs.
- Experience with relational and/or NoSQL databases.
- Proven experience working with design systems; understanding of design system principles and best practices.
- Experience with testing frameworks (e.g., Jest, React Testing Library, Cypress).
- Experience with cloud platforms (AWS preferred); familiarity with CI/CD pipelines is a significant advantage.
- Experience with version control systems (e.g., Git).
- Good problem-solving and debugging skills.
- Experience with Agile development methodologies is a plus.

Fanatics Commerce is a leading designer, manufacturer, and seller of licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods. It operates a vertically integrated platform of digital and physical capabilities for leading sports leagues, teams, colleges, and associations globally, as well as its flagship site, www.fanatics.com.

Fanatics Commerce has a broad range of online, sports venue, and vertical apparel partnerships worldwide, including comprehensive partnerships with leading leagues, teams, colleges, and sports organizations across the world—including the NFL, NBA, MLB, NHL, MLS, Formula 1, and Australian Football League (AFL); the Dallas Cowboys, Golden State Warriors, Paris Saint-Germain, Manchester United, Chelsea FC, and Tokyo Giants; the University of Notre Dame, University of Alabama, and University of Texas; the International Olympic Committee (IOC), England Rugby, and the Union of European Football Associations (UEFA).

At Fanatics Commerce, we infuse our BOLD Leadership Principles in everything we do:
- Build Championship Teams
- Obsessed with Fans
- Limitless Entrepreneurial Spirit
- Determined and Relentless Mindset

Fanatics is building a leading global digital sports platform. We ignite the passions of global sports fans and maximize the presence and reach of our hundreds of sports partners globally by offering products and services across Fanatics Commerce, Fanatics Collectibles, and Fanatics Betting & Gaming, allowing sports fans to Buy, Collect, and Bet. Through the Fanatics platform, sports fans can buy licensed fan gear, jerseys, lifestyle and streetwear products, headwear, and hardgoods; collect physical and digital trading cards, sports memorabilia, and other digital assets; and bet as the company builds its Sportsbook and iGaming platform. Fanatics has an established database of over 100 million global sports fans; a global partner network with approximately 900 sports properties, including major national and international professional sports leagues, players associations, teams, colleges, college conferences and retail partners, 2,500 athletes and celebrities, and 200 exclusive athletes; and over 2,000 retail locations, including its Lids retail stores. Our more than 22,000 employees are committed to relentlessly enhancing the fan experience and delighting sports fans globally.

Posted 21 hours ago

Apply

1.0 - 3.0 years

7 - 8 Lacs

Hyderābād

On-site

Source: Glassdoor

India - Hyderabad
Job ID: R-219085
Additional Locations: India - Hyderabad
Work Location Type: On Site
Date Posted: Jun. 27, 2025
Category: Information Systems

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

What you will do

Let's do this. Let's change the world. In this vital role you will be responsible for designing, building, maintaining, analyzing, and interpreting data to provide actionable insights that drive business decisions. The role involves working with large datasets, developing reports, supporting and executing data initiatives, and visualizing data to ensure data is accessible, reliable, and efficiently managed. The ideal candidate has strong technical skills, experience with big data technologies, and a deep understanding of data architecture and ETL processes.

- Design, develop, and maintain data solutions for data generation, collection, and processing
- Be a key team member who assists in the design and development of the data pipeline
- Create data pipelines and ensure data quality by implementing ETL processes to migrate and deploy data across systems
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions
- Take ownership of data pipeline projects from inception to deployment; manage scope, timelines, and risks
- Develop and maintain data models, data dictionaries, and other documentation to ensure data accuracy and consistency
- Implement data security and privacy measures to protect sensitive data
- Leverage cloud platforms (AWS preferred) to build scalable and efficient data solutions
- Collaborate and communicate effectively with product teams
- Collaborate with Data Architects, Business SMEs, and Data Scientists to design and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies that will help to improve ETL platform performance
- Participate in sprint planning meetings and provide estimates on technical implementation

Basic Qualifications:
- Master's degree and 1 to 3 years of Computer Science, IT or related field experience, OR
- Bachelor's degree and 3 to 5 years of Computer Science, IT or related field experience, OR
- Diploma and 7 to 9 years of Computer Science, IT or related field experience

Must-Have Skills:
- Hands-on experience with big data technologies and platforms, such as Databricks, Apache Spark (PySpark, Spark SQL), workflow orchestration, and performance tuning on big data processing
- Proficiency in data analysis tools (e.g., SQL)
- Proficient in SQL for extracting, transforming, and analyzing complex datasets from relational data stores
- Experience with ETL tools such as Apache Spark, and various Python packages related to data processing and machine learning model development
- Strong understanding of data modeling, data warehousing, and data integration concepts
- Proven ability to optimize query performance on big data platforms

Preferred Qualifications:
- Experience with software engineering best practices, including but not limited to version control, infrastructure-as-code, CI/CD, and automated testing
- Knowledge of Python/R, Databricks, and cloud data platforms
- Strong understanding of data governance frameworks, tools, and best practices
- Knowledge of data protection regulations and compliance requirements (e.g., GDPR, CCPA)

Professional Certifications:
- AWS Certified Data Engineer preferred
- Databricks certification preferred

Soft Skills:
- Excellent critical-thinking and problem-solving skills
- Strong communication and collaboration skills
- Demonstrated awareness of how to function in a team setting
- Demonstrated presentation skills

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

For a career that defies imagination, objects in your future are closer than they appear. Join us: careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
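As a hedged illustration of the PySpark-style transformation work described above (not Amgen's actual pipelines), the sketch below reads raw Parquet, standardises a timestamp, aggregates, and writes a curated, partitioned output. The bucket paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Hypothetical source: raw visit records landed as Parquet.
raw = spark.read.parquet("s3://example-bucket/raw/visits/")

# Basic transform: standardise a date column and aggregate visits per site.
daily_counts = (
    raw.withColumn("visit_date", F.to_date("visit_ts"))
       .groupBy("site_id", "visit_date")
       .agg(F.count("*").alias("visit_count"))
)

# Load into a curated zone, partitioned for downstream SQL analysis.
daily_counts.write.mode("overwrite").partitionBy("visit_date").parquet(
    "s3://example-bucket/curated/daily_visit_counts/"
)
```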

Posted 21 hours ago

Apply

3.0 - 4.0 years

1 - 8 Lacs

Hyderābād

On-site

Source: Glassdoor

Summary:

Contribute to the enhancement and maintenance of one or more Charles River IMS modules or components as a senior member of an agile scrum team. Provide engineering troubleshooting assistance to customer support teams and other development teams within Charles River.

Responsibilities:
- Work under minimal supervision to analyze, design, develop, test, and debug small to medium software enhancements and solutions within Charles River's business and technical problem domains.
- Develop, test, debug, and implement software programs, applications, and projects using Java, SQL, JavaScript, or other software engineering languages.
- Provide thoughtful insight and suggestions in code reviews.
- Write unit and automation tests to ensure a high-quality end product.
- Conduct manual tests to ensure a high-quality end product.
- Actively participate in the agile software development process by adhering to the CRD scrum methodology, including attending all daily standups, sprint planning, backlog grooming, and retrospectives.

Qualifications:

Education: B.S. degree (or foreign education equivalent) in Computer Science, Engineering, Mathematics, Physics, or another technical course of study required. M.S. degree strongly preferred.

Experience:
- 3 to 4 years of progressively responsible professional software engineering experience, preferably in a financial services product delivery setting.
- Minimum 3 years of experience in Java is required.
- 2 years of experience developing microservices using Spring Boot.
- 1 to 2 years of experience with Azure, AWS, or another cloud platform is required.
- 2 years of experience writing test cases using JUnit is strongly desired.
- Good to have: experience with JavaScript, Angular, or React JS.
- Good to have: experience with messaging services such as Azure Service Bus or RabbitMQ.
- 1 to 2 years of experience in financial services developing solutions for portfolio management is desired.
- Working experience with Docker is an added advantage.
- Experience with object-oriented programming, compiler or interpreter technologies, embedded systems, operating systems, relational databases (RDBMS), scripting, and new/advanced programming languages.
- Able to work on small to medium projects with little to no supervision and on more complex tasks with moderate oversight.
- Good written and verbal communication skills.
- Able to work well with peers in a collaborative team environment.
- A minimum of 2 years working with an Agile development methodology is strongly desired.

Supervisory Responsibility: Individual Contributor / Team Lead / Manager of Managers
Travel: May be required on a limited basis.

Posted 21 hours ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site


Project Role: User Experience Engineer
Project Role Description: Accountable for prototype work and other software engineering solutions that create an optimized user experience. Translate design concepts to prototype solutions as quickly and tangibly as possible, with a balanced understanding of technical feasibility implications and design intent.
Must have skills: React.js
Good to have skills: DevOps, Angular
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a User Experience Engineer, you will be responsible for creating innovative software engineering solutions that enhance user experiences. Your typical day will involve collaborating with design teams to translate concepts into functional prototypes, ensuring that the final product aligns with both user needs and technical capabilities. You will engage in iterative design processes, testing and refining prototypes to achieve optimal usability and performance, while also considering the technical implications of design choices. Your role will require a balance of creativity and technical insight, allowing you to contribute significantly to the development of user-centered applications.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Experience as a DevOps Engineer or in a similar software engineering role (5 years)
- Mandatory skills: Web API, C#, SQL Server (complex and relational databases), .NET
- Desired skills: Azure DevOps, Git, UI frameworks Angular and Kendo UI, Azure serverless computing, cloud computing security technologies, test-driven development (DB tests, unit tests)
- Knowledge of information security standards, such as OWASP
- Knowledge of usability and accessibility standards, such as WCAG
Professional & Technical Skills:
- Must To Have Skills: Proficiency in React.js, Web API, C#, SQL Server (complex and relational databases), .NET
- Good To Have Skills: Azure DevOps, Git, UI frameworks Angular and Kendo UI, Azure serverless computing, cloud computing security technologies, test-driven development (DB tests, unit tests)
- Strong understanding of user-centered design principles and methodologies.
- Experience with front-end development technologies and frameworks.
- Ability to create wireframes, prototypes, and user flows to effectively communicate design ideas.
Additional Information:
- The candidate should have a minimum of 3 years of experience in React.js.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Posted 21 hours ago

Apply

4.0 years

6 - 10 Lacs

Hyderābād

On-site


Description
At Vitech, we believe in the power of technology to simplify complex business processes. Our mission is to bring better software solutions to market, addressing the intricacies of the insurance and retirement industries. We combine deep domain expertise with the latest technological advancements to deliver innovative, user-centric solutions that future-proof and empower our clients to thrive in an ever-changing landscape. With over 1,600 talented professionals on our team, our innovative solutions are recognized by industry leaders like Gartner, Celent, Aite-Novarica, and ISG. We offer a competitive compensation package along with comprehensive benefits that support your health, well-being, and financial security.
Senior Site Reliability Engineer (SRE)
Location: Hyderabad (Hybrid Role)
Senior Site Reliability Engineer (SRE) – Join Our Global Engineering Team
At Vitech we believe that excellence in production systems starts with engineering-driven solutions to operational challenges. Our Site Reliability Engineering (SRE) team is at the heart of ensuring seamless performance for our clients, preventing potential outages, and proactively identifying and resolving issues before they arise. Our SRE team is a diverse group of talented engineers across India, the US, and Canada. We have T-shaped expertise spanning application development, database management, networking, and system administration across both on-premise environments and AWS cloud. Together, we support mission-critical client environments and drive automation to reduce manual toil, freeing our team to focus on innovation.
About the Role: Senior SRE
As an SRE, you'll be a key player in revolutionizing how we operate production systems for single and multi-tenant environments. You'll support SRE initiatives, support production, and drive infrastructure automation. Working in an Agile team environment, you'll have the opportunity to explore and implement the latest technologies, engage in on-call duties, and contribute to continuous learning as part of an ever-evolving tech landscape. If you're passionate about scalability, reliability, security, and automation of business-critical infrastructure, this role is for you.
What you will do:
Own and manage our AWS cloud-based technology stack, using native AWS services and top-tier SRE tools to support multiple client environments with Java-based applications and microservices architecture.
Design, deploy, and manage AWS Aurora PostgreSQL clusters for high availability and scalability.
Optimize SQL queries, indexes, and database parameters for performance tuning.
Automate database operations using Terraform, Ansible, AWS Lambda, and the AWS CLI.
Manage Aurora's read replicas, auto-scaling, and failover mechanisms.
Enhance infrastructure-as-code (IaC) patterns using technologies like Terraform, CloudFormation, Ansible, Python, and SDKs.
Collaborate with DevOps teams to integrate Aurora with CI/CD pipelines.
Provide full-stack support, per the assigned schedule, on applications across technologies such as Oracle WebLogic, AWS Aurora PostgreSQL, Oracle Database, Apache Tomcat, AWS Elastic Beanstalk, Docker/ECS, EC2, and S3.
Troubleshoot database incidents, perform root cause analysis, and implement preventive measures.
Document database architecture, configurations, and operational procedures.
Ensure high availability, scalability, and performance of PostgreSQL databases on AWS Aurora.
Monitor database health, troubleshoot issues, and perform root cause analysis for incidents.
Embrace SRE principles such as Chaos Engineering, reliability, and reducing toil.
What We're Looking For:
Proven hands-on experience as an SRE for critical, client-facing applications, with the ability to dive deep into daily SRE tasks, manage incidents, and oversee operational tools.
4+ years of experience in managing relational databases (Oracle and/or PostgreSQL) in both cloud and on-prem environments, including SRE tasks like backup/restore, performance issues, and replication (primary skill required for this role).
3+ years of experience hosting enterprise applications in AWS (EC2, EBS, ECS/EKS, Elastic Beanstalk, RDS, CloudWatch).
Strong understanding of AWS networking concepts (VPC, VPN/DX/Endpoints, Route53, CloudFront, Load Balancers, WAF).
Familiarity with tools like pgAdmin, psql, or other database management utilities.
Ability to automate routine database maintenance tasks (e.g., vacuuming, reindexing, patching); a short illustrative Python sketch follows this listing.
Knowledge of backup and recovery strategies (e.g., pg_dump, PITR).
Experience setting up and maintaining monitoring and alerting systems for database performance and availability (e.g., CloudWatch, Honeycomb, New Relic, Dynatrace).
Ability to work closely with development teams to optimize database schemas, queries, and application performance.
Ability to provide database support during application deployments and migrations.
Hands-on experience with web/application layers (Oracle WebLogic, Apache Tomcat, AWS Elastic Beanstalk, SSL certificates, S3 buckets).
Experience with containerized applications (Docker, Kubernetes, ECS).
Ability to leverage AWS Aurora features (e.g., read replicas, auto-scaling, multi-region deployments) to enhance database performance and reliability.
Automation experience with infrastructure as code (Terraform, CloudFormation, Python, Jenkins, GitHub/Actions).
Knowledge of multi-region Aurora Global Databases for disaster recovery.
Scripting experience in Python, Bash, Java, JavaScript, or Node.js.
Excellent written/verbal communication and critical thinking.
Willingness to work in shifts and assist your team to resolve issues efficiently.
Join Us at Vitech!
At Vitech, we believe in empowering our teams to drive innovation through technology. If you thrive in a dynamic environment and are eager to drive innovation in SRE practices, we want to hear from you! You'll be part of a forward-thinking team that values collaboration, innovation, and continuous improvement. We provide a supportive and inclusive environment where you can grow as a leader while helping shape the future of our organization.
About Vitech
At Vitech, Your Expertise Drives Transformative Change in Fintech
For over 30 years, Vitech has empowered leading players in insurance, pensions, and retirement with cutting-edge, cloud-native solutions and implementation services. Our mission is clear: harness technology to simplify complex business processes and deliver intuitive, user-centric software that propels our clients' success. At Vitech, you won't just fill a position; you'll join a purpose-driven team on a mission that truly matters. Innovation is at our core, and we empower you to push boundaries, unleash creativity, and contribute to projects that make a real difference in the financial sector. Though our name may be new to you, our impact is recognized by industry leaders like Gartner, Celent, Aite-Novarica, ISG, and Everest Group.
Why Choose Us?
With Vitech, you won't just fill a position; you'll be part of a purpose-driven mission that truly matters.
We pursue innovation relentlessly, empowering you to unleash your creativity and push boundaries. Here, you’ll work on cutting-edge projects that allow you to make a real difference—driving change and improving lives. We value strong partnerships that foster mutual growth. You will collaborate with talented colleagues and industry leaders, building trust and forming relationships that drive success. Your insights and expertise will be essential as you become an integral part of our collaborative community, amplifying not just your career but the impact we have on our clients. We are committed to a focus on solutions that makes a tangible difference. In your role, you will embrace the challenge of understanding the unique pain points faced by our clients. Your analytical skills and proactive mindset will enable you to develop innovative solutions that not only meet immediate needs but also create lasting value. Here, your contributions will directly influence our success and propel your professional growth. At Vitech, we foster an actively collaborative culture where open communication and teamwork are paramount. With our “yes and” philosophy, your ideas will be welcomed and nurtured, allowing you to contribute your unique insights and perspectives. This environment will enhance your ability to work effectively within diverse teams, empowering you to lead initiatives that result in exceptional outcomes. We believe in remaining curious and promoting continuous learning. You will have access to extensive resources and professional development opportunities that will expand your knowledge and keep you at the forefront of the industry. Your curiosity will fuel innovation, and we are committed to supporting your growth every step of the way. In addition to a rewarding work environment, we offer a competitive compensation package with comprehensive benefits designed to support your health, well-being, and financial security. At Vitech, you’ll find a workplace that challenges and empowers you to make meaningful contributions, develop your skills, and grow with a team that’s dedicated to excellence. If you’re ready to make a real impact in fintech and join a forward-thinking organization, explore the incredible opportunities that await at Vitech. Apply today and be part of our journey to drive transformative change!
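The maintenance-automation responsibilities in the listing above (vacuuming, reindexing, monitoring Aurora PostgreSQL health) can be illustrated with a small Python sketch. This is a hedged example only, using psycopg2; the cluster endpoint, credentials, and table names are invented placeholders rather than Vitech systems.

import psycopg2

TABLES = ["orders", "payments"]  # hypothetical tables to maintain

conn = psycopg2.connect(
    host="aurora-cluster.example.amazonaws.com",  # placeholder endpoint
    dbname="appdb",
    user="maintenance_user",
    password="example-only",
)
# VACUUM cannot run inside a transaction block, so enable autocommit.
conn.autocommit = True

with conn.cursor() as cur:
    for table in TABLES:
        # Table names come from the fixed allow-list above, not user input.
        cur.execute(f"VACUUM (ANALYZE) {table};")
        # Surface a basic health signal after maintenance.
        cur.execute(
            "SELECT relname, n_dead_tup FROM pg_stat_user_tables WHERE relname = %s;",
            (table,),
        )
        print(cur.fetchone())

conn.close()

In practice a job like this would be scheduled (for example via cron, AWS Lambda, or Ansible) and would push its results into the monitoring stack rather than printing them.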

Posted 21 hours ago

Apply

5.0 years

6 - 9 Lacs

Mohali

On-site


Experience: 5+ years
Job Location: Mohali (nearby candidates only)
Job Type: Full Time (Work from Office), 5-day working week
Job Overview – Senior Dot Net Developer
Skillset Required:
.NET Framework, .NET Core, C#, MVC, Web API, Dapper, Entity Framework, MS SQL Server, Git
Angular 12+ (front-end), HTML, CSS, JavaScript, jQuery
Microsoft Azure technologies (Resource Groups, Application Services, Azure SQL, Azure Web Jobs, Azure Automation & Runbooks, Azure PowerShell, etc.)
Experience in MVC design and coding
Key Responsibilities:
Be a key part of the full product development life cycle of software applications
Ability to prototype solutions quickly and analyze/compare multiple solutions and products based on requirements
Always focused on performance, scalability, and security
Hands-on experience in building REST-based solutions conforming to HTTP standards and knowledge of how TLS/SSL works
Proficiency with technologies like C#, ASP.NET, ASP.NET MVC, ASP.NET Core, JavaScript, Web API, and REST web services
Working knowledge of client-side frameworks such as jQuery (Kendo UI, AngularJS, and ReactJS are a plus)
Experience with cloud services offered by Microsoft Azure
Understanding and analyzing the non-functional requirements for the system and how the architecture reflects them
In-depth knowledge of encoding and encryption techniques and their usage
Extensive knowledge of industry standards such as OAuth 2.0, SAML 2.0, OpenID Connect, OpenAPI, SOAP, HTTP, and HTTPS
Proficiency with development tools – Visual Studio
Proficiency with application servers – IIS, Apache (considering that .NET Core is platform independent)
Experience in designing and implementing applications using databases – MySQL, MS SQL Server, Oracle, AWS Aurora, Azure Database for MySQL, and non-relational databases
Strong problem-solving and analytical skills
Experience with microservices-based architecture is a plus
Job Types: Full-time, Permanent
Pay: ₹50,000.00 - ₹80,000.00 per month
Benefits:
Paid sick time
Paid time off
Location Type: In-person
Schedule:
Day shift
Fixed shift
Monday to Friday
Work Location: In person
Speak with the employer +91 8091980889

Posted 21 hours ago

Apply

3.0 years

6 - 6 Lacs

Bengaluru

On-site


- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Bachelor's degree
Are you passionate about data and code? Does the prospect of dealing with mission-critical data excite you? Do you want to build data engineering solutions that process a broad range of business and customer data? Do you want to continuously improve the systems that enable annual worldwide revenue of hundreds of billions of dollars? If so, then the eCommerce Services (eCS) team is for you!
In eCommerce Services (eCS), we build systems that span the full range of eCommerce functionality, from Privacy, Identity, Purchase Experience and Ordering to Shipping, Tax and Financial integration. eCommerce Services manages several aspects of the customer life cycle, starting from account creation and sign in, to placing items in the shopping cart, proceeding through checkout, order processing, managing order history and post-fulfillment actions such as refunds and tax invoices. eCS services determine sales tax and shipping charges, and we ensure the privacy of our customers. Our mission is to provide a commerce foundation that accelerates business innovation and delivers a secure, available, performant, and reliable shopping experience to Amazon's customers.
The goal of the eCS Data Engineering and Analytics team is to provide high quality, on-time reports to Amazon business teams, enabling them to expand globally at scale. Our team has a direct impact on retail CX, a key component that drives the Amazon flywheel.
As a Data Engineer, you will own the architecture of DW solutions for the Enterprise using multiple platforms. You will have the opportunity to lead the design, creation and management of extremely large datasets, working backwards from business use cases. You will use your strong business and communication skills to work with business analysts and engineers to determine how best to design the data warehouse for reporting and analytics. You will be responsible for designing and implementing scalable ETL processes in the data warehouse platform to support the rapidly growing and dynamic business demand for data, and for delivering data as a service, which will have an immediate influence on day-to-day decision making.
Key job responsibilities
- Develop data products, infrastructure and data pipelines leveraging AWS services (such as Redshift, Kinesis, EMR, Lambda, etc.) and internal BDT tools (DataNet, Cradle, QuickSight, etc.)
- Improve existing solutions and come up with a next-generation Data Architecture to improve scale, quality, timeliness, coverage, monitoring and security.
- Develop new data models and end-to-end data pipelines.
- Create and implement a Data Governance strategy for mitigating privacy and security risks.
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
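As a rough illustration of the ETL work described above, here is a minimal PySpark sketch that reads raw events from S3, aggregates them, and writes partitioned Parquet back out. The bucket names, columns, and aggregation are assumptions for illustration only and do not refer to Amazon datasets or internal tooling.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Hypothetical raw input: one JSON order event per line.
raw = spark.read.json("s3://example-raw-bucket/orders/2025/06/")

daily = (
    raw.filter(F.col("order_status") == "COMPLETED")
       .withColumn("order_date", F.to_date("order_timestamp"))
       .groupBy("marketplace_id", "order_date")
       .agg(
           F.sum("order_total").alias("gross_sales"),
           F.countDistinct("customer_id").alias("unique_customers"),
       )
)

# Curated output, partitioned for downstream reporting queries.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders_daily/"
)
spark.stop()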

Posted 21 hours ago

Apply

6.0 years

0 Lacs

Bengaluru

On-site


Through our dedicated associates, Conduent delivers mission-critical services and solutions on behalf of Fortune 100 companies and over 500 governments - creating exceptional outcomes for our clients and the millions of people who count on them. You have an opportunity to personally thrive, make a difference and be part of a culture where individuality is noticed and valued every day.
The ETL Developer would be responsible for driving data migration strategy and execution within a complex enterprise landscape. This role will be responsible for bringing in the best practices of data migration and integrations with Salesforce:
Bring in best practices for Salesforce data migration and integration.
Create a data migration strategy for Salesforce implementations.
Define a template/uniform file format for migrating data into Salesforce.
Must-Have Skills:
Data Architect with 6-10 years of ETL experience and 5+ years of Informatica Cloud (IICS, ICRT) experience.
5+ years of experience on Salesforce systems.
Develop comprehensive data mapping and transformation plans to align data with the Salesforce data model and software solution.
Good understanding of the Salesforce data model and Schema Builder.
Excellent understanding of relational database concepts and how to best implement database objects in Salesforce.
Experience integrating large sets of data into Salesforce from multiple data sources.
Experience with EDI transactions.
Experience in design and development of ETL/data pipelines.
Excellent understanding of SOSL and SOQL and the Salesforce security model.
Full understanding of the project life cycle and development methodologies.
Ability to interact with technical and functional teams.
Excellent oral, written communication and presentation skills.
Should be able to work in an offshore/onsite model.
Experience:
Expert in ETL development with Informatica Cloud using various connectors.
Experience with real-time integrations and batch scripting.
Expert in implementing business rules by creating various transformations, working with multiple data sources such as flat files, relational and cloud databases, etc., and developing mappings.
Experience in using ICS workflow tasks: Session, Control Task, Command tasks, Decision tasks, Event Wait, Email tasks, pre-session, post-session, and pre/post commands.
Ability to migrate objects in all phases (DEV, QA/UAT and PRD) following standard defined processes.
Performance analysis with large data sets.
Experience in writing technical specifications based on conceptual design and stated business requirements.
Experience in designing and maintaining logical and physical data models and communicating them to peers and junior associates using flowcharts, unified data language, and data flow diagrams.
Good knowledge of SQL, PL/SQL and data warehousing concepts.
Experience in using Salesforce SOQL is a plus. (A brief illustrative Python sketch of the Salesforce data-mapping idea follows this listing.)
Responsibilities:
Excellent troubleshooting and debugging skills in Informatica Cloud.
Significant knowledge of PL/SQL including tuning, triggers, ad hoc queries, and stored procedures.
Strong analytical skills.
Works under minimal supervision with some latitude for independent judgement.
Prepare and package scripts and code across development, test, and QA environments.
Participate in change control planning for production deployments.
Conducts tasks and assignments as directed.
Conduent is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, creed, religion, ancestry, national origin, age, gender identity, gender expression, sex/gender, marital status, sexual orientation, physical or mental disability, medical condition, use of a guide dog or service animal, military/veteran status, citizenship status, basis of genetic information, or any other group protected by law. People with disabilities who need a reasonable accommodation to apply for or compete for employment with Conduent may request such accommodation(s) by submitting their request through this form that must be downloaded: click here to access or download the form. Complete the form and then email it as an attachment to FTADAAA@conduent.com. You may also click here to access Conduent's ADAAA Accommodation Policy. At Conduent we value the health and safety of our associates, their families and our community. For US applicants while we DO NOT require vaccination for most of our jobs, we DO require that you provide us with your vaccination status, where legally permissible. Providing this information is a requirement of your employment at Conduent.
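Informatica Cloud mappings are normally built in the IICS designer rather than hand-coded, but the Salesforce data-mapping idea referenced above can be sketched in Python. The example below is a hedged illustration using the third-party simple_salesforce package; the staging query, Salesforce object, field names, and credentials are all hypothetical.

import pandas as pd
from simple_salesforce import Salesforce
from sqlalchemy import create_engine

# Hypothetical relational staging source.
engine = create_engine("postgresql+psycopg2://etl_user:password@staging-db/erp")
staged = pd.read_sql("SELECT account_ext_id, name, phone FROM staged_accounts", engine)

sf = Salesforce(
    username="integration@example.com",
    password="example-only",
    security_token="example-token",
)

# Map staging columns onto the Salesforce Account object.
records = [
    {"External_Id__c": row.account_ext_id, "Name": row.name, "Phone": row.phone}
    for row in staged.itertuples(index=False)
]

# Bulk upsert keyed on an external id field, mirroring a typical migration template.
results = sf.bulk.Account.upsert(records, "External_Id__c")
print(results[:3])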

Posted 21 hours ago

Apply

5.0 years

4 - 6 Lacs

Bengaluru

On-site


Job Description:
We are seeking a highly experienced Java full-stack developer to join our team. As a Senior Developer, you will be responsible for leading and mentoring a team of Java Spring Boot developers in the design, development and maintenance of applications using the Spring Boot framework. Your expertise in Java, Spring Boot and other related technologies will be crucial in delivering high-quality solutions that meet our business requirements.
Key Responsibilities:
Lead and mentor a team of Java Spring Boot developers in the design, development and maintenance of applications
Work with business stakeholders and technical teams to gather and analyze requirements for Java Spring Boot applications
Design, develop and enhance software solutions using Java Spring Boot, including microservices, MVC, Spring Data and Spring Security
Write efficient and well-structured code to implement business logic and functionality on the Java platform
Perform unit testing and debugging to ensure the quality and reliability of developed applications
Maintain and enhance existing Java Spring Boot applications by troubleshooting issues, implementing bug fixes and optimizing performance
Collaborate with other developers, database administrators and system administrators to integrate Java Spring Boot applications with other systems and databases
Develop and maintain technical documentation, including system design, coding standards and user manuals
Stay updated with the latest Java Spring Boot technologies and industry trends and recommend improvements or alternative solutions to enhance system performance and efficiency
Collaborate with cross-functional teams to support system integration, data migration and software deployment activities
Participate in code reviews and provide constructive feedback to ensure adherence to coding standards and best practices
Proactively identify and address potential risks or issues related to Java Spring Boot applications and propose appropriate solutions
Provide leadership and guidance to the team and create a positive and productive work environment
Manage the team's workload and ensure that projects are completed on time and within budget
Delegate tasks and responsibilities to team members and provide regular feedback
Identify and develop the team's strengths and weaknesses and provide opportunities for professional growth
Technical Requirements:
Primary skills: Java, Spring Boot, Angular, React.js
Additional Responsibilities:
Requirements: Bachelor's degree in Computer Science, Information Technology or a related field
Minimum of 5 years of experience as a Java Spring Boot developer, with at least 3 years of team-handling experience
Strong understanding of Java programming concepts, including object-oriented programming, data structures and algorithms
Proficiency in the Spring Boot framework, including microservices, MVC, Spring Data and Spring Security
Extensive experience with Java development tools such as Eclipse and IntelliJ IDEA
Deep familiarity with relational databases, particularly MySQL and PostgreSQL
Expert knowledge of Java performance tuning and optimization techniques
Excellent problem-solving and analytical skills
Strong written and verbal communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical stakeholders
Detail-oriented with a commitment to delivering high-quality software solutions
Proven ability to lead and mentor a team of developers
Leadership and management skills
Preferred Skills:
Experience with cloud computing platforms such as AWS, Azure or Google Cloud Platform
Familiarity with software development methodologies such as Agile or Scrum
Understanding of software version control systems such as Git or Subversion
Certification in Java, Spring Boot or related technologies
Preferred Skills: Java,Java->J2EE,Java->Microservices,Java->Springboot,Java->Struts

Posted 21 hours ago

Apply

1.0 - 2.0 years

4 - 8 Lacs

Bengaluru

On-site


Who We Are
Bain & Company is a global management consulting firm that helps the world's most ambitious change-makers define the future. Across 65 offices in 40 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry.
Who You'll Work With
BCN Labs is a Center of Excellence (CoE) functioning akin to a small R&D boutique startup within the Bain ecosystem, delivering end-to-end, data-driven, client-deployable solutions across a wide variety of sectors and industries. We work directly with other CoEs and Practices within Bain as part of the Expert Client Delivery system and interface with teams across the globe. We are first and foremost business thought partners, working on intelligent ways of using analytical techniques and algorithms across the spectrum of disciplines that can enable building world-class solutions. Our goal is to build a disruptive, high-impact, business-enabled, end-to-end analytical solutions delivery system across all verticals of Bain.
What You Will Do
We're seeking an Associate – Back-End/Full-Stack Developer to join our team and build scalable, secure, and efficient backend services that power client-facing analytical applications and internal platforms. You'll work closely with front-end developers, data scientists, and business leads to implement logic, manage data flow and ensure robust performance of web applications.
As an Associate – Back-End/Full-Stack Developer, you will:
Build Backend Services: Develop modular, scalable server-side applications and services using Python-based frameworks such as FastAPI or Django
Design & Integrate APIs: Collaborate with front-end developers and data scientists to design clean, well-documented APIs that meet evolving business needs
Manage Data Storage/Interactions: Write efficient database queries, optimize storage and retrieval operations, and ensure data integrity across services
Collaborate Across Teams: Work closely with front-end developers and data scientists to integrate business logic and ensure smooth data pipelines
Uphold Code Quality: Participate in code reviews, adhere to best practices, and write maintainable and testable code
Drive Improvement: Demonstrate strong problem-solving skills, team collaboration, and a proactive attitude toward learning and improvement
About You
Education & Experience:
Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
1–2 years of real-world experience in backend development (or full-stack development) and service integration, with exposure to data-driven applications or platforms.
You will fit into our team-oriented structure with a college/hostel-style way of working, where you can comfortably reach out to anyone for the support that enables us to serve our clients better.
Core Technical Skills:
Strong proficiency in Python, with hands-on experience building backend services using FastAPI for high-performance, asynchronous APIs and/or Django Rest Framework (DRF) for robust, secure and scalable APIs
Experience designing and implementing RESTful APIs, including authentication, versioning, and documentation (e.g., OpenAPI/Swagger)
Proficient in working with relational databases (e.g., PostgreSQL, MySQL, MS SQL Server)
Exposure to NoSQL databases such as MongoDB or Redis (for caching) is a plus
Skilled in implementing business logic, data parsing, validation, and serialization using Pydantic (FastAPI) or DRF serializers
Exposure to deploying Python applications on cloud services like AWS (EC2, Lambda, S3) or Azure
Familiarity with Docker, Git, and basic CI/CD practices for backend deployments
(Good-to-have) Basic understanding of React.js or Vue.js to better collaborate with front-end teams
(Good-to-have) Experience or exposure to integrating FastAPI/DRF backends with front-end clients via REST APIs
(Good-to-have) Understanding of how front-end apps consume and render API responses (e.g., JSON) and how to optimize for usability
Development Tools & Practices:
Experience with version control using Git in collaborative environments (e.g., GitHub)
Familiarity with sprint-based work planning and code reviews
Exposure to testing frameworks (e.g., Pytest, Unittest) and debugging practices
What makes us a great place to work
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
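As a small illustration of the FastAPI-plus-Pydantic pattern named above, here is a minimal, self-contained sketch of one validated endpoint. The model fields, endpoint path, and baseline numbers are invented for illustration and are not part of the role description.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-analytics-api")

class ScenarioRequest(BaseModel):
    market: str
    growth_rate: float  # parsed and validated by Pydantic

class ScenarioResponse(BaseModel):
    market: str
    projected_revenue: float

BASELINE_REVENUE = {"IN": 120.0, "US": 450.0}  # hypothetical baseline data

@app.post("/scenarios", response_model=ScenarioResponse)
def run_scenario(req: ScenarioRequest) -> ScenarioResponse:
    baseline = BASELINE_REVENUE.get(req.market)
    if baseline is None:
        raise HTTPException(status_code=404, detail="Unknown market")
    return ScenarioResponse(
        market=req.market,
        projected_revenue=baseline * (1 + req.growth_rate),
    )

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)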

Posted 21 hours ago

Apply

0 years

0 Lacs

Bengaluru

On-site


Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
Inviting applications for the role of Assistant Manager, Data Analytics - Senior SME!
In this role, you will be focusing on fraud detection, AML/CTF, and transaction monitoring using SQL, Python, and BI tools to develop analytical solutions and enhance risk oversight. Strong stakeholder engagement and problem-solving skills are key.
Responsibilities
Support the managers and business leads to ensure that the respective TM/CRA/WLM/AEoI programs are working as intended and have appropriate oversight.
Use advanced SQL/Python techniques to define analytical products which meet project needs and interpret business rules into code.
Utilise analytics techniques in SQL and Python to model, design, and implement new transaction monitoring scenarios. (A brief illustrative Python sketch of a simple monitoring rule follows this listing.)
Deliver robust documentation, code and processes using Confluence, GitLab, and SharePoint to ensure a clear audit trail of decisions, implementation and lineage of data products.
Qualifications we seek in you!
Minimum Qualifications / Skills
Technical Skills:
Intermediate SQL proficiency for data extraction, modeling, and analytics.
Beginner Python skills for data analysis, scripting, and automation.
Experience working with relational databases to manage and manipulate large datasets.
Expertise in business intelligence and data visualization using tools like Power BI, Tableau, or Qlik Sense.
Strong data quality management capabilities and the ability to spot trends, quality issues and anomalies in new data sources and identify ways to work around these issues.
Soft Skills & Work Experience:
Ability to translate business requirements into analytical solutions, working closely with both technical and non-technical stakeholders.
Strong problem-solving mindset to detect anomalies, identify patterns, and enhance risk coverage.
Ability to work under pressure and meet deadlines, including the ability to multi-task, prioritise and balance competing demands and expectations.
Simplify the complex – the ability to generate insight from data and engage and communicate those insights effectively with non-technical business customers.
Strong documentation and governance skills, ensuring clear audit trails of decisions and data processes.
Preferred Qualifications / Skills
Financial services experience, especially within banking or wealth management.
Experience in financial crime risk management, with emphasis on AML/CTF and sanctions.
Experience in the AWS tool stack for analytics (EMR, S3, etc.).
Experience in data visualisation tools such as Power BI, Qlik Sense or Tableau.
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Assistant Manager
Primary Location: India-Bangalore
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 27, 2025, 9:45:22 AM
Unposting Date: Ongoing
Master Skills List: Operations
Job Category: Full Time
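To make the transaction-monitoring scenario work above a little more concrete, here is a hedged Python/pandas sketch of one simplified rule. The column names, data source, and threshold are illustrative assumptions only and do not represent an actual AML/CTF scenario.

import pandas as pd

txns = pd.read_csv("transactions_sample.csv")  # hypothetical extract

# Rule sketch: flag customers whose 7-day cash-deposit total exceeds a threshold.
txns["txn_date"] = pd.to_datetime(txns["txn_date"])
cash = txns[txns["channel"] == "CASH_DEPOSIT"].sort_values("txn_date")

rolling = (
    cash.set_index("txn_date")
        .groupby("customer_id")["amount"]
        .rolling("7D")
        .sum()
        .reset_index(name="cash_7d_total")
)

THRESHOLD = 1_000_000  # illustrative value, not a calibrated monitoring threshold
alerts = rolling[rolling["cash_7d_total"] > THRESHOLD]
print(alerts.head())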

Posted 21 hours ago

Apply

175.0 years

0 Lacs

Bengaluru

On-site


At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.
This role is for a Global Capabilities Product Owner in the Regulatory Automation and Change Management team within the Financial Reporting Quality Assurance organization (FRQA), in support of the Regulatory Reporting Automation program. The Global Capabilities Product Owner team is responsible for delivering the regulatory data and automation capabilities required to support the regulatory reporting team. Some of the key capabilities supported by the team include areas like the regulatory data domain, cash flow projections, analytical drill-down, counterparty classifications, and Basel RWA calculations. This individual will coordinate with several groups within American Express during the course of designing, implementing, and migrating the implemented solution into production. The individual selected will partner closely with the Data Sourcing Architects and Process Owners to drive the priorities of the technology scrum team and ensure that the software features developed align with the original requirements provided. The individual will also need to monitor project progress, solve issues that arise, and write technical features and user stories. This team has a holistic understanding of numerous data sources, processing, and regulatory reports.
The Product Owner is a collaborator: a well-organized, action-oriented individual with exceptional leadership and functional expertise, confident in presenting, facilitating, and building a network of strong relationships across our organization. This role will require strong collaboration with Technology to design how functionality will work and to validate at regular intervals that the software features developed align with the original requirements provided to the team.
How will you make an impact in this role?
Participate in daily stand-ups with the pods (implementation groups for various portfolios of the Company for data sourcing and regulatory classification and reporting), leading and delivering efficient solutions to complex prioritization and business requirements
Lead and guide regulatory reporting data and automation requirements on existing processes and datasets to understand and support Point of Arrival (POA) process design
Develop functional requirement documentation and process-specific design documentation to support regulatory report owner requirements and testing lifecycle processes
Understand and guide the determination of portfolios, data elements, attribute analysis, and the grain of data required for designing processes
Work with complex cross-functional teams: engineers, architects, governance and business partners
Closely collaborate with users to understand pain points, requirements and feedback, and provide them with timely resolutions
Identify and support business requirements, functional design, prototyping, testing, training, and supporting implementations
Design and build ongoing data process controls by collaborating with Technology and Data Governance as needed
Manage program blocking issues, anticipate and make tradeoffs, and balance the business needs versus technical or operational constraints
Document and understand core components of solution architecture including data patterns, data-related capabilities, and standardization and conformance of disparate datasets
Lead and guide the implementation of master and reference data to be used across operational and reporting processes
Coordinate with various Product Owners, Process Owners, Subject Matter Experts, Solution Architecture colleagues, and the Data Management team to ensure builds are appropriate
Be knowledgeable in development methodologies, using tools such as SQL, to drive understanding of system functionality and expected automation results
Minimum Qualifications
Degree in Finance/Accounting and/or Information Technologies.
5+ years of work experience in US Federal Reserve/US financial regulatory reporting, banking/financial services, and/or Controllership.
Working knowledge of the Scaled Agile Framework, an Agile mindset, and the ability to embrace new opportunities and adapt easily to change.
Strong knowledge of and working experience in regulatory regulations and reporting is required, and any exposure to US regulations is preferred.
Experience eliciting and documenting technical business requirements via the creation of features and user stories.
Strong understanding of relational database concepts and experience working in a big data environment (Hadoop / Cornerstone) preferred, both on-prem and in the cloud.
IT data management experience.
Previous work experience in various IT disciplines such as infrastructure, software development, data management or data analytics.
Exhibits organizational skills with the ability to meet/exceed critical deadlines and manage multiple deliverables simultaneously.
A self-starter, proactive team player with a passion to consistently deliver high-quality service and exceed customers' expectations.
Strong analytical and problem-solving skills as well as the ability to create impactful relationships with key stakeholders.
Excellent relationship building, presentation and collaboration skills.
Excellent written and verbal communications with the ability to communicate highly complex concepts and processes in simple terms and pragmatically across Finance, Business and Technology stakeholders.
Display thought leadership, drive process, and support work/life balance initiatives.
Preferred Qualifications
Knowledge of US Regulatory Reports (Y9C, Y14, Y15, 2052a, amongst others)
Working exposure in development of financial data domains to support regulatory and analytical requirements for large-scale banking/financial organizations
SAFe Agile certification is a plus
Project Management Professional (PMP) certification is a plus
Knowledge of and working experience with AxiomSL/Adenza/Nasdaq solutions is preferred
SQL and data analysis experience
Testing management and execution experience is a plus
Foundational data architecture principles and data management experience
Certified Data Management Professional (CDMP) is a plus
We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities
American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 21 hours ago

Apply

6.0 years

6 - 9 Lacs

Bengaluru

On-site


Job Requirements
Looking for an energetic, self-motivated and exceptional Data Engineer to work on extraordinary enterprise products based on AI and Big Data engineering. He/she will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, Integration Specialists and UX developers.
Work Experience
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
Experience building and optimizing 'big data' data pipelines, architectures and data sets.
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Strong analytic skills related to working with unstructured datasets.
Experience supporting and working with cross-functional teams in a dynamic environment.
6+ years of experience in a Data Engineer role, with a graduate degree in Computer Science, Statistics, Information Systems or another quantitative field.
Should have experience using the following software/tools:
Experience with relational SQL and NoSQL databases, including Postgres and RDS MS SQL.
Experience in PySpark programming.
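A brief PySpark sketch of the kind of work described above: turning semi-structured event logs into a relational aggregate. The S3 path, JDBC URL, and column names are placeholders, and the example assumes the PostgreSQL JDBC driver is available on the Spark classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clickstream-sessionize").getOrCreate()

# Loosely structured event logs, one JSON object per line (placeholder path).
events = spark.read.json("s3://example-bucket/clickstream/2025-06-*/")

sessions = (
    events.withColumn("event_time", F.to_timestamp("ts"))
          .filter(F.col("user_id").isNotNull())
          .groupBy("user_id", F.window("event_time", "30 minutes").alias("session"))
          .agg(
              F.count(F.lit(1)).alias("events"),
              F.countDistinct("page").alias("distinct_pages"),
          )
)

# Persist the aggregate into PostgreSQL for downstream reporting.
(sessions
    .withColumn("session_start", F.col("session.start"))
    .drop("session")
    .write
    .format("jdbc")
    .option("url", "jdbc:postgresql://reporting-db:5432/analytics")
    .option("dbtable", "user_sessions")
    .option("user", "etl_user")
    .option("password", "example-only")
    .mode("append")
    .save())

spark.stop()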

Posted 21 hours ago

Apply

4.0 years

6 - 8 Lacs

Bengaluru

On-site


Job Description:
Chubb is the world's largest publicly traded property and casualty insurer. With operations in 54 countries, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. Chubb is also defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally.
The ideal candidate for this role is someone with a strong background in P&C underwriting product knowledge, business and data analysis who is eager to work with large complex datasets in a big data environment. You are adept at implementing prefill data solutions for P&C products using external data sources and data science models. You have a critical eye for perceiving actionable insights from a multitude of data sources, ranging from structured to unstructured. You have an innate ability to engage, influence and champion innovative solutions and are never afraid to come up with out-of-the-box solutions to traditional P&C underwriting. You are a self-starter who will take ownership of your projects and deliver high-quality data-driven analytics solutions.
Primary Job Responsibilities:
Designing, creating and implementing dashboards and data visualization solutions using QlikSense.
Extracting, transforming and loading data from disparate sources into the QlikSense application.
Designing, building, testing and deploying QlikSense scripts to import data from source systems and testing QlikSense dashboards to meet customer requirements.
Interacting with business users to gather requirements and translate those requirements into QlikSense visualizations.
Troubleshooting and resolving issues related to QlikSense dashboards and data issues.
Developing data models to meet the needs of the organization's information systems.
Optimizing data load times and improving the performance of reporting dashboards.
Setting up QlikSense user access controls and maintaining document version control.
Integrating QlikSense with other platforms and systems to facilitate automated and on-demand data retrieval.
Documenting QlikSense processes, projects, and system configurations.
Ensuring all QlikSense dashboards and processes comply with enterprise standards and protocols.
Participating in continuously improving and optimizing the QlikSense application.
Training end-users on how to use QlikSense dashboards and solving their queries related to it.
Providing ongoing support and maintenance of the QlikSense platform and dashboards.
Keeping up-to-date with the latest technologies, trends and techniques related to QlikSense.
Qualification/Experience
Bachelor's in Computer Science, Information Systems or a related educational background
4+ years of work experience as a Business/Data/Product Analyst
Solid experience in creating dashboards using QlikSense
Programming experience with Qlik scripting and data model building is a must
Knowledge of Microsoft Azure; QMC and Mashups knowledge/experience is a plus
Excellent working knowledge of SQL and experience querying data platforms and relational databases such as MySQL, Oracle, DB2, etc.
Demonstrated knowledge of and experience in business data definition, data profiling and data mapping analysis
Excellent understanding of how technology impacts the business
Must have strong business acumen as well as technical solutions expertise
Strong ability to independently perform analysis of business workflow and technology issues to facilitate decision-making
Multi-tasking along with strong organization and time management skills is a must
Must have strong problem-solving skills
Excellent interpersonal, communication, presentation and documentation skills
Knowledge of Agile methodologies

Posted 21 hours ago

Apply

Exploring Relational Jobs in India

The job market in India for relational roles is thriving with opportunities for individuals who excel in building and maintaining relationships with clients, customers, and colleagues. These roles require strong communication, interpersonal skills, and the ability to work well with others. If you are considering a career in a relational job, India offers a plethora of options for you to explore.

Top Hiring Locations in India

  1. Mumbai
  2. Bangalore
  3. Delhi
  4. Hyderabad
  5. Pune

These cities are known for their vibrant job markets and high demand for professionals in relational roles.

Average Salary Range

The average salary range for relational professionals in India varies based on experience and location. Entry-level positions may start at around INR 3-5 lakhs per annum, while experienced professionals can earn upwards of INR 10-15 lakhs per annum.

Career Path

In the field of relational jobs, career progression typically follows a path from entry-level positions to more senior roles. For example:

  1. Relationship Manager
  2. Senior Relationship Manager
  3. Relationship Director

With experience and expertise, professionals in relational roles can move up the ladder and take on more strategic and leadership positions.

Related Skills

Apart from strong relational skills, professionals in this field are often expected to have skills such as:

  • Communication
  • Negotiation
  • Problem-solving
  • Customer service

Having a combination of these skills can make you a valuable asset in a relational job role.

Interview Questions

  • What experience do you have in building and maintaining relationships with clients? (basic)
  • Can you give an example of a challenging situation you faced in a client relationship and how you resolved it? (medium)
  • How do you handle conflicts with colleagues or clients? (medium)
  • What strategies do you use to gain the trust of your clients? (basic)
  • Have you ever had to deal with a difficult client? How did you handle the situation? (medium)
  • How do you prioritize your tasks when managing multiple client relationships? (basic)
  • What do you think are the key qualities of a successful relationship manager? (basic)
  • How do you stay updated on industry trends and changes that may impact your client relationships? (medium)
  • Can you provide an example of a successful cross-selling or upselling experience with a client? (medium)
  • How do you handle rejection or negative feedback from clients? (basic)
  • Describe a time when you had to collaborate with a team to achieve a common goal. (basic)
  • How do you measure the success of your client relationships? (medium)
  • Have you ever had to terminate a client relationship? If so, how did you handle it? (medium)
  • How do you handle high-pressure situations when dealing with clients? (medium)
  • What CRM tools or software have you used in the past to manage client relationships? (basic)
  • How do you adapt your communication style to different types of clients? (basic)
  • Can you give an example of a time when you went above and beyond to meet a client's needs? (medium)
  • How do you handle confidential information shared by clients? (basic)
  • Describe a time when you had to manage a client crisis. How did you handle it? (medium)
  • How do you keep track of important client details and interactions? (basic)
  • What strategies do you use to ensure long-term client retention? (medium)
  • How do you handle a situation where a client is not satisfied with your service? (medium)
  • Describe a successful partnership or collaboration you initiated with a client. (medium)
  • How do you stay motivated and engaged in your client relationships over time? (basic)
  • Can you provide an example of a time when you had to manage conflicting priorities among multiple clients? (medium)

Conclusion

As you navigate the job market for relational roles in India, remember to showcase your strong communication and interpersonal skills during interviews. Prepare well, demonstrate your ability to build and maintain relationships effectively, and apply for opportunities confidently. With the right skills and attitude, you can excel in the field of relational jobs in India. Good luck with your job search!
