3.0 years
6 - 9 Lacs
Hyderābād
On-site
Basic qualifications
- 3+ years of program or project management experience
- 3+ years of working cross-functionally with tech and non-tech teams
- 3+ years of defining and implementing process improvement initiatives using data and metrics
- Advanced knowledge of Excel (Pivot Tables, VLOOKUPs) and SQL
- Experience defining program requirements and using data and metrics to determine and drive improvements

Selling Partner Support (SPS) is responsible for creating a trustworthy shopping experience across Amazon stores worldwide by protecting customers, brands, selling partners and Amazon from fraud, counterfeit, and abuse, as well as empowering, providing world-class support to, and building loyalty with Amazon's millions of selling partners. We value individual expression, respect different opinions, and work together to create a culture where each of us is able to contribute fully. Our unique backgrounds and perspectives strengthen our ability to achieve Amazon's mission of being Earth's most customer-centric company.

Within SPS, Global Process Management (GPM) strives to make Amazon the best way for Partners to reach customers locally and globally and to operate their businesses, driven by the accurate and efficient support and solutions we provide them. GPM focuses both on preventing Selling Partner (Seller, Vendor and Brand Registry) contacts based on knowledge obtained during our support interactions, and on handling those contacts with quality and efficiency.

Key job responsibilities
• Interfacing between Amazon business teams and Selling Support Operations to facilitate changes.
• Collaborating with operational, training, product, and software development teams to identify, define and specify solutions that create the conditions for Selling Partner and Associate success and satisfaction.
• Establishing collaborative relationships with business teams to build roadmaps that identify and reduce contacts (both incoming and productivity efforts), reduce effort and/or improve the Selling Partner experience.
• Problem-solving, from strategic to real-time, requiring extensive use of data collection and analysis, and preparing and executing regular program updates to senior management.
• Being a visible and vocal role model across the wider business for Amazon's customer-centric culture, championing Selling Partner needs and using data and technology to anticipate and exceed them.

Additional qualifications
- 3+ years of driving end-to-end delivery and communicating results to senior leadership
- 3+ years of driving process improvements
- Experience in stakeholder management, dealing with multiple stakeholders at varied levels of the organization
- Experience building processes, project management, and schedules

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information.
If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
Posted 10 hours ago
3.0 years
4 - 6 Lacs
Hyderābād
On-site
Required Skills:
- 3+ years of experience with Power Platform (includes one or more of the following: Power Apps, Power Automate, Power BI); strong experience in Dataverse
- 3+ years of experience with Azure (or another public cloud platform such as AWS, GCP, etc.)
- 3+ years of experience working with SQL and RDBMS systems (such as SQL Server, Oracle, Postgres, MySQL, etc.)
- 3+ years of experience working with enterprise-grade programming languages such as Python, Java, C#, etc.

Requirements:
- Work collaboratively as a key team player in a cohesive, supportive environment.
- Take full ownership of your role, consistently aiming to exceed client expectations.
- Design and implement robust workflow automation solutions using industry-standard low-code/no-code (LCNC) platforms, following best practices, company policies and security guidelines.
- Coordinate with platform administration teams to follow and promote Application Lifecycle Management, Platform Lifecycle Management, logging, monitoring and alerting best practices.
- Collaborate with platform architects and administrators to follow and promote governance, security, performance and scalability best practices.
- Drive the adoption of workflow automation platforms within the organization with an "Automation-first" mindset.
- Apply user-centric design principles to develop apps and workflows with consistent, intuitive and user-friendly interfaces.
- Tackle and resolve complex issues related to performance, reliability, and security.
- Guide and support junior and mid-level engineers through mentorship and technical advice.
- Learn, test, adopt and recommend the right use of the latest developments in automation technologies.
- Assist with production support, addressing and resolving live environment issues.
- Demonstrate flexibility in working with diverse, global, cross-functional project teams.
- Lead agile development with Product Owner(s) by planning, prioritizing, designing, testing, and managing end-to-end solution delivery.
Posted 10 hours ago
5.0 years
5 - 7 Lacs
Hyderābād
On-site
About the Role: Grade Level (for internal use): 09

The Role: Senior Analyst, BI Sales Analytics

The Team: The Senior Analyst, BI Sales Analytics will play a crucial role within the Data Analytics and Insights group under the Customer Experience function of S&P Global Market Intelligence. Our team is known for its analytical excellence and is highly sought after by various stakeholders across the organization. We are looking for a talented Senior Analyst to contribute to our Sales analytics and planning capabilities, providing actionable insights and support for reporting and data analytics requirements.

The Impact: In this role, you will develop and deliver analytical solutions that empower business leaders to make informed decisions and drive sales performance. You will work closely with cross-functional teams to build and enhance dashboards that provide insights into sales performance, customer behavior, and market trends.

What's in it for you: You will have the opportunity to develop your analytical skills and contribute to strategic initiatives that drive revenue growth for Market Intelligence. Your work will directly influence sales strategies and operational effectiveness, making a tangible impact on the organization.

Responsibilities:
- Collaborate with stakeholders to understand data requirements and develop analytical models that address business needs.
- Analyze large datasets from various sources to extract meaningful insights and trends that inform sales strategies.
- Assist in the development and maintenance of sales dashboards and performance tracking tools.
- Ensure data integrity and accuracy in all reporting and analytics work, adhering to best practices in data management.
- Support the team in fostering a data-driven culture by promoting the use of analytics across the organization.
- Present findings and recommendations to various audiences, including senior leadership, in a clear and compelling manner.

What we're looking for:
- Bachelor's degree in a relevant field (e.g., Business, Engineering, Data Science).
- 5+ years of experience in data analytics within the Information Services industry.
- Strong analytical skills with the ability to manipulate and interpret large datasets.
- Excellent communication skills, with the ability to tailor messages to diverse audiences.
- Proven ability to prioritize tasks and manage multiple projects simultaneously.

Preferred Qualifications:
- Master's degree or MBA is a plus.
- Experience with data visualization tools such as Power BI, Tableau, or similar.
- Proficiency in Excel and SQL, and familiarity with programming languages such as R or Python.
- Knowledge of sales operations and metrics within the Information Services industry.
- Ability to work collaboratively in a fast-paced environment and adapt to changing priorities.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day.
We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment.
Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), SLSGRP202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316972
Posted On: 2025-06-18
Location: Hyderabad, Telangana, India
Posted 10 hours ago
5.0 years
7 - 10 Lacs
Hyderābād
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
- Collaborate with product owners, systems analysts, and software engineers to deliver against an agile roadmap
- Collaborate with architects and other software engineers to evaluate functional and non-functional requirements and deliver creative, high-quality solutions
- Follow and improve on processes for continuous delivery and DevOps within your team
- Build and maintain configuration management and automation tools, deployment strategies/processes, and monitoring tools
- Experienced in software engineering practices like reliability engineering, deployment planning, fault-tolerant architecture, and test automation
- Experience in Data as a Service and Container as a Service models
- Experience in incident ticket tracking tools and processes like ServiceNow
- Collaborate on quality strategies that ensure our data platform is correct, resilient, scalable, and secure
- Support applications throughout the SDLC from design to production deployment
- Participate in and provide input for system analysis, design walkthroughs and code reviews
- Participate in defect review and triage
- Adhere to design/coding standards and constantly improve the way we build and deliver software
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree in engineering or equivalent experience
- 5+ years of experience in designing ETL/ELT solutions using tools like Azure Data Factory and Azure Databricks, or Snowflake
- 4+ years of designing and implementing cloud-based applications
- Experience with Kafka for real-time data streaming and integration
- Experience in creating and maintaining Azure Infrastructure as Code using Terraform and GitHub Actions
- Experience in creating and configuring CI/CD pipelines using GitHub Actions for various Azure services
- Experience with these or similar technologies: Azure Kubernetes, Azure Databricks, Docker, GitHub/GitHub Actions, Kafka
- Experience utilizing version control systems (e.g., Git) for code management and familiarity with CI/CD (Continuous Integration/Continuous Deployment) methodologies for automated software delivery; these are essential for modern development practices
- Experience building data-intensive systems in a public, hybrid, or private cloud environment
- Experience collaborating with teams in an Agile delivery / onshore-offshore model
- Firsthand experience with specific AI techniques and frameworks, such as Large Language Models (LLMs), Retrieval Augmented Generation (RAG), or autonomous agents
- Expertise in programming languages such as Python
- Working knowledge of RESTful APIs
- Proficiency in Snowflake for data wrangling and management
- Solid proficiency in SQL (Structured Query Language) for data querying and manipulation, along with expertise in at least one programming language commonly used in data engineering, such as Python or Scala; these are fundamental technical skills
- In-depth understanding of managing security aspects of Azure infrastructure
- Proven ability to use DBT to build and maintain data marts and views
- Proven ability to configure, set up, and maintain GitHub for various code repositories
- Proven solid problem-solving skills and ability to diagnose and troubleshoot technical issues
- Proven excellent communication skills for explaining technical issues and solutions

Preferred Qualifications:
- Relevant cloud certifications, particularly Microsoft Azure certifications such as Azure Data Engineer Associate or Azure Solutions Architect Expert
- Thorough understanding of data modeling principles (conceptual, logical, and physical) and comprehensive knowledge of data warehousing concepts and best practices; this is crucial for designing effective data solutions
- Proven excellent analytical skills and proven ability to work with delivery teams to think out of the box

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
Posted 10 hours ago
2.0 years
0 Lacs
India
Remote
Hiring for Senior Data Scientist
Location: Madhapur (Hybrid)

- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Proven experience as a Data Scientist or Data Science Trainer (2–5+ years preferred).
- Proficiency in Python, R, SQL, machine learning frameworks (e.g., Scikit-learn, TensorFlow, PyTorch), and visualization tools (e.g., Matplotlib, Tableau, Power BI).
- Strong communication, presentation, and public speaking skills.
- Experience with LMS platforms and remote teaching tools (Zoom, Google Meet, etc.).

Job Types: Part-time, Internship
Contract length: 2 months
Expected hours: 2 per week
Schedule: Day shift
Work Location: In person
Posted 10 hours ago
4.0 years
0 Lacs
Hyderābād
On-site
Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas – Oncology, Inflammation, General Medicine, and Rare Disease – we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Scientific Business Analyst – Research Data and Analytics

What you will do

Let's do this. Let's change the world. In this vital role, you will primarily focus on analyzing scientific requirements from Global Research and translating them into efficient and effective information systems solutions. As a domain expert, the prospective BA collaborates with cross-functional teams to identify data product enhancement opportunities, perform data analysis, solve issues, and support system implementation and maintenance. Additionally, the role involves developing the data product launch and user adoption strategy for Amgen Research Foundational Data Systems. Your expertise in business process analysis and technology will contribute to the successful delivery of IT solutions that drive operational efficiency and meet business objectives.

- Collaborate with geographically dispersed teams, including those in the US, EU and other international locations.
- Partner with and ensure alignment of the Amgen India DTI site leadership, and follow global standards and practices.
- Foster a culture of collaboration, innovation, and continuous improvement.
- Function as a Scientific Business Analyst, providing domain expertise for Research Data and Analytics within a Scaled Agile Framework (SAFe) product team.
- Serve as Agile team scrum master or project manager as needed.
- Serve as a liaison between global DTI functional areas and global research scientists, prioritizing their needs and expectations.
- Create functional analytics dashboards and fit-for-purpose applications for quantitative research, scientific analysis and business intelligence (Databricks, Spotfire, Tableau, Dash, Streamlit, RShiny).
- Handle a suite of custom internal platforms, commercial off-the-shelf (COTS) software, and systems integrations.
- Translate complex scientific and technological needs into clear, actionable requirements for development teams.
- Develop and maintain release deliverables that clearly outline the planned features and enhancements, timelines, and milestones.
- Identify and handle risks associated with the systems, including technological risks, scientific validation, and user acceptance.
- Develop documentation, communication plans and training plans for end users.
- Ensure scientific data operations are scoped into building Research-wide Artificial Intelligence/Machine Learning capabilities.
- Ensure operational excellence, cybersecurity and compliance.

What we expect of you

We are all different, yet we all use our unique contributions to serve patients.
This role requires expertise in biopharma scientific domains as well as informatics solution delivery. Additionally, extensive collaboration with global teams is required to ensure seamless integration and operational excellence. The ideal candidate will have a solid background in the end-to-end software development lifecycle and be a Scaled Agile practitioner, coupled with change management and transformation experience. This role demands the ability to deliver against key organizational strategic initiatives, develop a collaborative environment, and deliver high-quality results in a matrixed organizational structure.

Basic Qualifications/Skills:
- Doctorate degree OR Master's degree and 4 to 6 years of Life Science/Biotechnology/Pharmacology/Information Systems experience OR Bachelor's degree and 6 to 8 years of Life Science/Biotechnology/Pharmacology/Information Systems experience OR Diploma and 10 to 12 years of Life Science/Biotechnology/Pharmacology/Information Systems experience
- Excellent problem-solving skills and a passion for solving complex challenges in drug discovery with technology and data
- Superb communication skills and experience creating impactful slide decks with data
- Collaborative spirit and effective communication skills to work seamlessly in a multi-functional team
- Familiarity with data analytics and scientific computing platforms such as Databricks, Dash, Streamlit, RShiny, Spotfire, Tableau and related programming languages like SQL, Python, R

Preferred Qualifications/Skills:
- BS, MS or PhD in Bioinformatics, Computational Biology, Computational Chemistry, Life Sciences, Computer Science or Engineering
- 3+ years of experience in implementing and supporting biopharma scientific research data analytics
- Demonstrated expertise in a scientific domain area and related technology needs
- Understanding of semantics and FAIR (Findability, Accessibility, Interoperability and Reuse) data concepts
- Understanding of scientific data strategy, data governance, data infrastructure
- Experience with cloud (e.g. AWS) and on-premise compute infrastructure
- Familiarity with advanced analytics, AI/ML and scientific computing infrastructure, such as High Performance Compute (HPC) environments and clusters (e.g. SLURM, Kubernetes)
- Experience with scientific and technical team collaborations, ensuring seamless coordination across teams and driving the successful delivery of technical projects
- Ability to deliver features meeting research user demands using Agile methodology
- An ongoing commitment to learning and staying at the forefront of AI/ML advancements

We understand that to successfully sustain and grow as a global enterprise and deliver for patients — we must ensure a diverse and inclusive work environment.

Professional Certifications:
- SAFe for Teams certification (preferred)
- SAFe Scrum Master or similar (preferred)

Soft Skills:
- Strong transformation and change management experience.
- Exceptional collaboration and communication skills.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented with a focus on achieving team goals.
- Strong presentation and public speaking skills.

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way.
In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
Posted 10 hours ago
1.0 - 2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us

Hiver gives teams the simplest way to deliver outstanding and personalized customer service. As a customer service solution built on Gmail, Hiver is intuitive, super easy to learn, and delightful to use. Hiver is used by thousands of teams at some of the best-known companies in the world to provide attentive, empathetic, and human service to their customers at scale. We're a top-rated product on G2 and rank very highly on customer satisfaction. At Hiver, we obsess about being world-class at everything we do. Our product is loved by our customers, our content engages a very wide audience, our customer service is one of the highest rated in the industry, and our sales team is as driven about doing right by our customers as they are by hitting their numbers. We're profitably run and are backed by notable investors. K1 Capital led our most recent round of $27 million. Before that, we raised from Kalaari Capital, Kae Capital, and Citrix Startup Accelerator.

Opportunity

As a Software Development Engineer I (Backend) at Hiver, you will play a pivotal role in building our technical landscape and learning with our team while contributing to our vibrant company culture. With our customer base expanding rapidly and processing over 5 million emails daily for thousands of active users, you will be at the forefront of creating and enhancing the user experience. In this role, you will work with a team of highly skilled engineers while providing critical technical direction to our back-end development efforts. You will also have the exciting opportunity to tackle intricate technical challenges, including optimizing our architecture to handle the surging email volume and establishing a framework to monitor and enhance the performance of our back-end systems.

You Will Be Working On
- Designing, building, and maintaining scalable backend systems and APIs for our B2B SaaS platform.
- Writing clean, maintainable, and efficient code while following software engineering best practices.
- Collaborating closely with cross-functional teams including Product, Design, and QA to deliver high-impact features.
- Taking ownership of small to medium-sized modules and features, from ideation to production.
- Debugging and resolving production issues, and continuously improving system performance and reliability.
- Participating in code reviews and sharing knowledge with peers to foster a strong engineering culture.

We Are Looking For
- 1-2 years of experience in software development with strong fundamentals in data structures, algorithms, and object-oriented programming.
- Proficiency in at least one backend language like Python, with hands-on experience in modern frameworks.
- Exposure to relational databases (MySQL with a solid understanding of SQL), Redis, queues, and caching techniques.
- Familiarity with REST APIs, microservices architecture, and cloud platforms (AWS).
- A passion for solving real-world customer problems through technology.
- Good communication skills and the ability to work effectively in a collaborative team environment.
- A learning mindset and enthusiasm to grow in a fast-paced SaaS startup environment.
Posted 10 hours ago
8.0 years
7 - 9 Lacs
Hyderābād
On-site
Job description

Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Software Engineer – Mainframe Testing.

In this role, you will need:
- Strong experience in Mainframe technology (COBOL, DB2, CICS, JCL, VSAM etc.) to be involved in System Testing, SIT, Functional and Regression Testing test cases and their execution.
- Ability to create, run and execute JCLs in Mainframe, validate job status in the spool, and resolve JCL errors; experience in writing complex SQL queries and running them against the Mainframe DB post execution.
- Identify and escalate risks and issues to project management and managers; provide daily status progress reporting to project stakeholders and senior management; attend project calls and meetings; and support end-to-end SIT, UAT and Penny Test.
- Well versed with the different testing methodologies (Waterfall, Scrum, Agile etc.), techniques, phases and tools, and able to use them as per the project requirement.
- Execute tests, track and monitor defects/stories to closure, and produce test metrics and test reports using GitHub, Jenkins, automation tools, etc.
- Excellent oral and written communication skills, with the ability to accept and deliver assistance to meet deadlines; self-directed, organized and capable of multi-tasking.
- Good analytical and problem-solving skills.
- Experience in creating reports in JIRA, SharePoint, Confluence etc.

Leadership and Guidance:
- Experience in creating test estimations and test strategies, and assisting associates to create the Test Plan.
- Provide regular updates to management while managing day-to-day testing activities, such as test monitoring and status reporting at the end of the testing day.
- Nurture team members on mainframe and functional skills and expertise, e.g., REXX, JCL, COBOL, DB2, VSAM and CICS, and Core Banking (lending products, product ledger management, fees and interest etc.).
- Assign tasks based on individual skills and workload, ensuring efficient use of resources.
- Track team progress and individual contributions against goals and deadlines.

Requirements

The successful candidate will also meet the following requirements:
- Software Engineer in Testing with extensive Mainframe development and testing experience (8+ years minimum); should be an SME in Mainframe Testing (zSeries) – JCL, DB2, COBOL, CICS, Endevor, File Manager, and debugging programs.
- Knowledge and usage of tools: File Manager/File Aid, Xpediter, Endevor and SPUFI.
- Knowledge of the unsecured and secured lending area in banking products, e.g. personal loans, consumer loans, mortgages, BACS (Direct Debit / Direct Credit) payments.
- Able to adapt and read JCLs for the testing environment using production JCLs.
- Able to analyze COBOL and DB2 code when required, analyze abends from the spool, and use debugging tools.
- Able to create test scenarios and test cases, and execute test cases using TOSCA, ALM, JIRA and Quality Centre on mainframe-based systems and applications.
- Excellent communication skills.
- TOSCA, GitHub, Jenkins, CI/CD pipeline, ALM, JIRA etc. – experience good to have.
- Cloud, GCP, performance testing (JMeter, Performance Center, AppDynamics) – experience good to have.
- Other retail/wholesale banking experience – good to have.

You'll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
Posted 10 hours ago
5.0 years
5 - 8 Lacs
Hyderābād
Remote
About the Team

At Uber, we reimagine the way the world moves for the better. There are several operations and technologies that enable this mission, and Uber's AI Solutions organization leads many of those capabilities, such as data annotation for AI/ML innovation, app testing, localization/internationalization, map editing, digitization programs, data services, etc., for all Uber Lines of Business. We combine technology and human intelligence optimally to run scaled programs. The tech+ops solution, coupled with Uber's strength of building a platform for flexible work, will enable enterprises worldwide to accelerate their data, AI, and product journeys. While we do this, we look forward to creating flexible earnings opportunities through online tasks for millions of people across the world. Together, our tech, operations expertise, and platform for knowledge work are uniquely positioned to be the best-in-class human-in-the-loop solution for the industry. We are building this new business line and now offer our solutions to businesses of all sizes, all across the globe. With this, our focus is to "Reimagine the way the world works". We are always looking for ways to better serve and engage our gig workers and enterprise customers. To do so, we bring the best of Uber by collaborating across multiple teams/orgs and tapping into the power of the Uber core platforms and network.

About the Role

As a Program Manager on the team, you will be responsible for ensuring that the team complies with Uber's fiscal, business and legal policies for a new line of business. The ideal candidate for this role should have strong program management and analytical skills, be extremely well organized, and be able to clearly communicate and present information to drive better decisions/results.

What You'll Do
- Manage the budget and forecasting processes with stakeholders for a new line of business.
- Establish and maintain a unified data source for all customer revenue and expenses at the project level.
- Optimize and automate the invoice validation process for customer revenue and expenses.
- Establish a weekly reporting process for revenue, expenses and margins to senior leadership.
- Collaborate closely with all stakeholders and manage a dashboard for all customer reporting.
- Develop and track key performance indicators (KPIs) and metrics to generate cost insights and areas of optimization.
- Ensure compliance with all financial regulations, tax, legal requirements, and internal policies.
- Independently identify issues, structure and conduct analyses, and form conclusions with minimal guidance.

What You'll Need
- 5-7 years of work experience in program management, business analysis, data analysis, managing budget processes, or related experience.
- Bachelor's degree in Business, Data Analytics, or another quantitative focus.
- Basic knowledge of GAAP and advanced SQL proficiency.
- Experience with developing automations and AI for optimizing processes.
- Excellent analytical skills, logical and structured thinking, and creative problem-solving.
- Ability to influence and communicate with decision makers, with outstanding written and verbal communication skills.
- Excellent organizational skills to juggle many tasks without losing sight of the highest priority items.
- Ability to understand complex concepts and make reasoned decisions with sometimes imperfect data.
- Ability to work with remote teams and across time zones to develop strategies and foster a cohesive and creative work environment.
Posted 10 hours ago
0 years
2 - 3 Lacs
Hyderābād
On-site
Ready to shape the future of work?

At Genpact, we don't just adapt to change—we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

We are inviting applications for the role of Process Associate, Developer Support Engineering (DSE).

The Developer Support Engineering (DSE) team ensures global support coverage for our software developer community, coding and maintaining applications and databases. The team comes from a variety of Information Technology backgrounds, from entrepreneurs to full stack developers. Although our backgrounds are different, we all have one thing in common: the desire to help the software developer community all around the world through the development of innovative new tools, which help make engaging with external developers a more connected and smoother experience. As a Developer Support Engineer, you will be working closely with Software Engineers, providing coding and technical support to developers and helping them ensure a high-quality experience for their users. We are looking for someone who is passionate about coding and solving problems. This role is perfect for someone who has been in the Information Technology (IT) industry providing technical support and wants to take on a new challenge.

Responsibilities:
- Develop and maintain various products/software using APIs, SDKs, and platform plugins.
- Code software and troubleshoot issues in the PHP, Python and JavaScript programming languages.
- Query and maintain SQL database tables.
- Manage technical product issues and escalations, delivering the highest level of customer satisfaction by making use of software development methodologies such as Lean and Six Sigma.
- Work closely with Software Engineer Developers to understand their needs and develop solutions.
- Understand and analyse high-end metrics such as productivity, utilisation, turnaround time, transfer and escalation rates, etc.
- Perform data analysis with a Use Case submission to visualise trends, provide solutions and mitigate issues.
- Ramp up and train new hires in the process, including end-to-end knowledge base management.
- Stakeholder management: work collaboratively with both internal and external stakeholders.
- Responsible for the team's operational metrics; co-own them with the FLM and drive the team's knowledge.

Qualifications we seek in you

Minimum qualifications
- Bachelor's or equivalent in computer science or a related field.
- Relevant experience providing enterprise support in a technical environment.
- Strong analytical/coding and communication skills.
- Ability to be flexible, multitask and learn in a fast-paced environment.
- Customer-focused, able to demonstrate understanding and empathy.
- Creative problem solver with excellent troubleshooting skills.
- Self-driven nature with strong attention to detail and follow-through.

Preferred qualifications
- Programming and scripting experience (PHP, Python, JavaScript).
- Experience working with APIs, plugins, and SQL databases.
- Experience with tools such as Tableau, Unidash, Scuba, and the Google Drive environment.
- Web development experience.
- Lean and Six Sigma methodologies.

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Process Associate
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 19, 2025, 5:33:11 AM
Unposting Date: Ongoing
Master Skills List: Operations
Job Category: Full Time
Posted 10 hours ago
2.0 years
0 Lacs
India
Remote
The Oliver Wyman DNA team is now looking to hire a Senior Data Analytics Specialist - we are looking for individuals with strong experience in Data Analytics and the Private Capital industry.

OW DNA Overview

The Oliver Wyman DNA is a distinguished center of excellence for business analytics and data analytics within Oliver Wyman. This group leverages data and information to provide business insights to Oliver Wyman consulting teams, driving positive outcomes and tangible impact for Oliver Wyman's clients. The group combines cutting-edge data science expertise with core consulting expertise to augment engagement teams with analytical firepower to deliver outstanding results.

Key Responsibilities:
- Deploy best-in-class analytics, statistical models, and research methods to solve complex business problems and generate impactful business reports
- Support due diligence and valuation projects for private equity clients, assisting in buy decisions, sell/IPO valuations, and post-transaction deal value realization
- Conduct thorough market, financial, and operational due diligence to support investment decisions and deal-value realization
- Develop and maintain financial models, valuation analyses, and data-driven insights using best-in-class analytics and AI techniques
- Prepare clear, concise reports and presentations for internal teams and client stakeholders
- Collaborate with senior team members to identify risks, growth opportunities, and value creation levers for clients
- Support business development efforts by gathering market intelligence and contributing to client proposals
- Maintain and enhance data management and reporting tools leveraging MS Excel, PowerPoint, and other relevant software

Education: Bachelor's degree in Science, Finance, Mathematics, Economics or equivalent. MS or certificate courses in analytics preferred.

Experience:
- Overall experience of 2+ years in data analytics, with a minimum of 1+ years of exposure to market research and/or due diligence
- Excellent analytical and problem-solving skills with a proven ability to deliver actionable insights, and proficiency in financial modelling techniques
- Knowledge and in-depth experience with customer research techniques (interviews, surveys, focus groups)
- Experience of doing business research across multiple sectors, preferably in a global consulting firm set-up
- Strong written and verbal communication skills with demonstrated ability to interact effectively with all levels of stakeholders (both internal and external)
- Experience working with specialized data sources such as Capital IQ, Factiva, Bloomberg etc.
- Advanced skills in MS Office, along with familiarity with Gen AI and other analytical tools preferred
- Strong experience in data analytics and visualization tools such as SQL, Python and Power BI
- Quick learner with the ability to pick up a new tool/platform quickly

Oliver Wyman, a business of Marsh McLennan (NYSE: MMC), is a management consulting firm combining deep industry knowledge with specialized expertise to help clients optimize their business, improve operations and accelerate performance. Marsh McLennan is a global leader in risk, strategy and people, advising clients in 130 countries across four businesses: Marsh, Guy Carpenter, Mercer and Oliver Wyman. With annual revenue of $24 billion and more than 90,000 colleagues, Marsh McLennan helps build the confidence to thrive through the power of perspective. For more information, visit oliverwyman.com, or follow on LinkedIn and X.
Marsh McLennan is committed to embracing a diverse, inclusive and flexible work environment. We aim to attract and retain the best people and embrace diversity of age, background, caste, disability, ethnic origin, family duties, gender orientation or expression, gender reassignment, marital status, nationality, parental status, personal or social status, political affiliation, race, religion and beliefs, sex/gender, sexual orientation or expression, skin color, or any other characteristic protected by applicable law. Marsh McLennan is committed to hybrid work, which includes the flexibility of working remotely and the collaboration, connections and professional development benefits of working together in the office. All Marsh McLennan colleagues are expected to be in their local office or working onsite with clients at least three days per week. Office-based teams will identify at least one “anchor day” per week on which their full team will be together in person.
Posted 10 hours ago
1.0 - 3.0 years
0 - 0 Lacs
Hyderābād
On-site
Job Information
Date Opened: 06/18/2025
Job Type: Full time
Industry: Education
Work Experience: 1-3 years
Salary: ₹20,000 - ₹30,000
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500001

About Us

Fireblaze AI School is a part of Fireblaze Technologies, which was started in April 2018 with a vision to up-skill and train in emerging technologies.

Mission Statement: "To Provide Measurable & Transformational Value To Learners Career"

Vision Statement: "To Be The Most Successful & Respected Job-Oriented Training Provider Globally."

We focus widely on creating a huge digital impact; hence, a strong presence on digital platforms is a must-have for us.

Job Description

Deliver engaging classroom and/or online training sessions on topics including:
- Python for Data Science
- Data Analytics using Excel and SQL
- Statistics and Probability
- Machine Learning and Deep Learning
- Data Visualization using Power BI / Tableau

Responsibilities also include:
- Create and update course materials, projects, assignments, and quizzes.
- Provide hands-on training and real-world project guidance.
- Evaluate student performance, provide constructive feedback, and track progress.
- Stay updated with the latest trends, tools, and technologies in Data Science.
- Mentor students during capstone projects and industry case studies.
- Coordinate with the academic and operations team for batch planning and feedback.
- Assist with the development of new courses and curriculum as needed.

Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- Proficiency in Python, SQL, and data handling libraries (Pandas, NumPy, etc.).
- Hands-on knowledge of machine learning algorithms and frameworks like Scikit-learn, TensorFlow, or Keras.
- Experience with visualization tools like Power BI, Tableau, or Matplotlib/Seaborn.
- Strong communication, presentation, and mentoring skills.
- Prior teaching/training experience is a strong advantage.
- Certification in Data Science or Machine Learning (preferred but not mandatory).
Posted 10 hours ago
5.0 - 7.0 years
8 - 10 Lacs
India
On-site
We need someone with hands-on experience who is good at analysis, troubleshooting, and debugging production issues. Please find the JD below:

We are seeking a skilled MuleSoft Developer with 5–7 years of hands-on experience in developing and supporting integration solutions using Mule 4. The ideal candidate will have strong expertise in Anypoint Platform components including Studio, Runtime (preferably CloudHub 2.0), API Manager, and Exchange. This role involves working on development, production support, incident management, and performance optimization of MuleSoft applications.

Key Responsibilities:
- Design, develop, and deploy integration solutions using MuleSoft (Mule 4) and Anypoint Studio.
- Work on CloudHub 2.0, API Manager, and Anypoint Exchange for managing and sharing APIs.
- Handle production support, resolve incidents, and troubleshoot critical issues in MuleSoft-based applications.
- Develop and consume REST/SOAP APIs, write complex DataWeave transformations, and define RAML/OAS specifications.
- Apply enterprise integration patterns and best practices to ensure scalable, maintainable solutions.
- Integrate with relational databases (Oracle, SQL Server) and cloud data platforms such as Snowflake, AWS, and Azure.
- Utilize CI/CD tools like Jenkins, Docker, and Kubernetes for automated builds and deployments.
- Monitor and log applications using tools such as Splunk, ELK, or New Relic.
- Analyze and resolve complex issues using thread dumps, heap dumps, and Mule logs.

Required Skills:
- 5–7 years of hands-on MuleSoft development (Mule 4).
- Strong experience with Anypoint Studio, API Manager, CloudHub 2.0, and Exchange.
- Solid background in integration troubleshooting and production support.
- Proficiency with REST/SOAP, DataWeave, RAML/OAS, and integration design patterns.
- Working knowledge of Oracle, SQL Server, and cloud data solutions like Snowflake, AWS, or Azure.
- Experience with CI/CD and DevOps tools: Jenkins, Docker, Kubernetes.
- Familiarity with monitoring/logging tools: Splunk, ELK, New Relic.
- Strong debugging and performance tuning skills.

Preferred Qualifications:
- MuleSoft certifications such as MCD – MuleSoft Certified Developer and/or MCIA – MuleSoft Certified Integration Architect.

Job Type: Full-time
Pay: ₹800,000.00 - ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday

Experience:
- Anypoint Studio: 5 years (Required)
- MuleSoft Development: 5 years (Required)
- API Manager: 5 years (Required)
- CloudHub 2.0: 5 years (Required)
- Exchange: 5 years (Required)
- Integration troubleshooting and production support: 5 years (Required)
- REST/SOAP, DataWeave, RAML/OAS, integration design patterns: 5 years (Required)
- Oracle, SQL Server: 5 years (Required)
- Snowflake, AWS, or Azure: 5 years (Required)
- Jenkins, Docker, Kubernetes: 5 years (Required)
- Splunk, ELK, New Relic: 5 years (Required)
- Debugging and performance tuning skills: 5 years (Required)

Work Location: In person
Posted 10 hours ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We help the world run better

At SAP, we enable you to bring out your best. Our company culture is focused on collaboration and a shared passion to help the world run better. How? We focus every day on building the foundation for tomorrow and creating a workplace that embraces differences, values flexibility, and is aligned to our purpose-driven and future-focused work. We offer a highly collaborative, caring team environment with a strong focus on learning and development, recognition for your individual contributions, and a variety of benefit options for you to choose from.

What You'll Do

We are looking for a skilled engineer to join our team focused on building application integrations via microservices and tooling. You will play a critical role in designing, developing, and deploying scalable solutions that streamline workflows and enable seamless integration across platforms. With expertise in Java, you will help drive innovation and operational efficiency within our organization.

Application Development
- Design and develop robust, scalable applications and interfaces for consuming platform services and tools.
- Implement APIs and microservices to enable cross-platform integrations.

Integration Solutions
- Build and maintain data and application integration solutions using Java.
- Work with diverse APIs and enterprise systems to ensure seamless data flow.

Quality and Maintainability
- Ensure high quality and maintainability with programming methods, asset reuse, and large-scale patterns.

What You Bring
- Degree in Computer Science, Data Science, Business Informatics or a related field with over 3 years' experience.
- Proficient in Java for backend development.
- Experience with RESTful APIs for integration services.
- Experience with SQL for querying and transforming data.
- Understanding of developing applications/services and deploying them in BTP or similar SaaS platforms.
- Familiarity with SAP Build and the CAP framework is an advantage.
- Experience with SAP Analytics Cloud and SAP Datasphere or similar analytics/data warehousing environments is a plus.
- Experience with Databricks and/or Spark would be desirable.
- Solid technical background with the ability to execute independently and share best practices with others.
- Understanding of the fundamentals in data management, data engineering or data visualization topics is a bonus.
- Ability to quickly learn new areas such as SAP Business Applications (S/4HANA, HCM, ISBN, CX).

Meet your team

Our team is delivering a single platform to simplify our customers' data management, analytics, planning and AI needs. We establish data engineering workloads on top of critical SAP business applications through semantically unified data access across the SAP portfolio and Databricks. Finally, we enable our customers to unlock the value of all this data with unique Insight Apps directly integrated into the data stack.

Bring out your best

SAP innovations help more than four hundred thousand customers worldwide work together more efficiently and use business insight more effectively. Originally known for leadership in enterprise resource planning (ERP) software, SAP has evolved to become a market leader in end-to-end business application software and related services for database, analytics, intelligent technologies, and experience management.
As a cloud company with two hundred million users and more than one hundred thousand employees worldwide, we are purpose-driven and future-focused, with a highly collaborative team ethic and commitment to personal development. Whether connecting global industries, people, or platforms, we help ensure every challenge gets the solution it deserves. At SAP, you can bring out your best.

We win with inclusion

SAP's culture of inclusion, focus on health and well-being, and flexible working models help ensure that everyone – regardless of background – feels included and can run at their best. At SAP, we believe we are made stronger by the unique capabilities and qualities that each person brings to our company, and we invest in our employees to inspire confidence and help everyone realize their full potential. We ultimately believe in unleashing all talent and creating a better and more equitable world. SAP is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you are interested in applying for employment with SAP and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to the Recruiting Operations Team: Careers@sap.com

For SAP employees: Only permanent roles are eligible for the SAP Employee Referral Program, according to the eligibility rules set in the SAP Referral Policy. Specific conditions may apply for roles in Vocational Training.

EOE AA M/F/Vet/Disability. Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, ethnicity, gender (including pregnancy, childbirth, et al.), sexual orientation, gender identity or expression, protected veteran status, or disability.

Successful candidates might be required to undergo a background verification with an external vendor.

Requisition ID: 427044 | Work Area: Software-Design and Development | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: .
Posted 10 hours ago
3.0 years
6 - 9 Lacs
Hyderābād
On-site
Job Description Summary

Provide analytical support to Novartis key stakeholders to support decision-making processes. Support and facilitate data-enabled decision making for Novartis internal customers by providing and communicating qualitative and quantitative analytics. Generate reports that track product metrics, progress, and KPIs. This role requires a blend of business insight and technical understanding, enabling you to collaborate with brand teams, marketing teams, and all functions to maximize value.

Technical requirements: SQL, Dataiku; Python is a plus.

About the Role

Key Responsibilities:
- Solid understanding of multiple datasets (e.g., LAAD, Xponent, DDD); managing and coordinating data sets from databases to find patterns and trends, and distilling this complex, granular data into actionable insights.
- Responsible for standard and ad-hoc extracts/reports across multiple primary and secondary data sources.
- Responsible for tracking ongoing outcomes reports and managing priorities for upcoming reports.
- Sharing findings with partners through reports and presentations on a timely basis.
- Putting together specifications to extract/transform data into required formats for different analytical elements using programming languages like SQL or other data processing tools.
- Building the foundation for more sophisticated approaches to APLD analysis and advanced analytics wherever it is required and beneficial.
- Establishing and maintaining positive relationships with key functional partners.

Essential Requirements:
- Ability to work independently and as an integral member of the team; attention to detail and a quality focus; good interpersonal and communication skills; influence, negotiation, and tact; innovative and collaborative behaviors; and a "can-do" orientation.
- Curiosity and strong analytical thinking, verbal and written communication skills, and exposure to working in a multifunctional/multicultural environment.
- Conceptual, analytical, and tactical thinking; strategic thought process.
- Ability to multi-task and work in a demanding, distributed team environment under tight deadlines.
- Develop and maintain strong individual performance.

Desirable Requirements:
- Master's or Bachelor's degree in a STEM field.
- At least 3+ years of experience in data modeling and reporting solutions development, hands-on experience with APLD and US national and subnational datasets, and the ability to lead teams functionally.
- Technical abilities: Excel, SQL or Dataiku, and PowerPoint are vital. Knowledge of statistical modeling or ML is a plus.

Why Novartis: Helping people with disease and their families takes more than innovative science. It takes a community of smart, passionate people like you. Collaborating, supporting and inspiring each other. Combining to achieve breakthroughs that change patients' lives. Ready to create a brighter future together? https://www.novartis.com/about/strategy/people-and-culture

Join our Novartis Network: Not the right Novartis role for you?
Sign up to our talent community to stay connected and learn about suitable career opportunities as soon as they come up: https://talentnetwork.novartis.com/network

Benefits and Rewards: Read our handbook to learn about all the ways we'll help you thrive personally and professionally: https://www.novartis.com/careers/benefits-rewards

Division: US
Business Unit: Universal Hierarchy Node
Location: India
Site: Hyderabad (Office)
Company / Legal Entity: IN10 (FCRS = IN010) Novartis Healthcare Private Limited
Functional Area: Marketing
Job Type: Full time
Employment Type: Regular
Shift Work: No

Accessibility and accommodation

Novartis is committed to working with and providing reasonable accommodation to individuals with disabilities. If, because of a medical condition or disability, you need a reasonable accommodation for any part of the recruitment process, or in order to perform the essential functions of a position, please send an e-mail to [email protected] and let us know the nature of your request and your contact information. Please include the job requisition number in your message.

Novartis is committed to building an outstanding, inclusive work environment and diverse teams representative of the patients and communities we serve.
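As a hedged illustration of the extract-and-summarize work this role describes, the sketch below pulls claim-level records into pandas with a SQL query and rolls them up into a simple trend report. The connection string, table, and column names are hypothetical placeholders, not Novartis systems or datasets.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection and schema; real datasets (e.g., LAAD, Xponent) differ.
engine = create_engine("postgresql+psycopg2://user:password@analytics-db/warehouse")

query = """
    SELECT brand, prescriber_specialty,
           DATE_TRUNC('month', fill_date) AS month,
           SUM(total_rx) AS total_rx
    FROM rx_claims                      -- hypothetical APLD-style table
    WHERE fill_date >= '2024-01-01'
    GROUP BY brand, prescriber_specialty, DATE_TRUNC('month', fill_date)
"""
claims = pd.read_sql(query, engine)

# Month-over-month growth by brand, the kind of KPI a tracking report might show.
trend = (
    claims.groupby(["brand", "month"], as_index=False)["total_rx"].sum()
          .sort_values(["brand", "month"])
)
trend["mom_growth_pct"] = trend.groupby("brand")["total_rx"].pct_change() * 100
print(trend.head())
```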
Posted 10 hours ago
4.0 years
6 - 9 Lacs
Hyderābād
Remote
Data Engineer II
Hyderabad, Telangana, India + 2 more locations

Date posted: Jun 18, 2025
Job number: 1829143
Work site: Up to 50% work from home
Travel: 0-25%
Role type: Individual Contributor
Profession: Software Engineering
Discipline: Data Engineering
Employment type: Full-Time

Overview

Microsoft is a company where passionate innovators come to collaborate, envision what can be, and take their careers further. This is a world of more possibilities, more innovation, more openness, and sky-is-the-limit thinking in a cloud-enabled world. Microsoft's Azure Data engineering team is leading the transformation of analytics in the world of data with products like databases, data integration, big data analytics, messaging & real-time analytics, and business intelligence. The products in our portfolio include Microsoft Fabric, Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, and Power BI. Our mission is to build the data platform for the age of AI, powering a new class of data-first applications and driving a data culture.

Within Azure Data, the Microsoft Fabric platform team builds and maintains the operating system and provides customers a unified data stack to run an entire data estate. The platform provides a unified experience, unified governance, enables a unified business model, and a unified architecture.

The Fabric Data Analytics, Insights, and Curation team is leading the way at understanding the Microsoft Fabric composite services and empowering our strategic business leaders. We work with very large and fast-arriving data and transform it into trustworthy insights. We build and manage pipelines, transformations, platforms, models, and so much more that empowers the Fabric product. As an Engineer on our team your core function will be Data Engineering, with opportunities in Analytics, Science, Software Engineering, DevOps, and Cloud Systems. You will be working alongside other Engineers, Scientists, Product, Architecture, and Visionaries bringing forth the next generation of data democratization products.

We do not just value differences or different perspectives. We seek them out and invite them in so we can tap into the collective power of everyone in the company. As a result, our customers are better served.

Qualifications

Required/Minimum Qualifications
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 4+ years' experience in business analytics, data science, software development, data modeling, or data engineering work; OR Master's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or related field AND 2+ years' experience in business analytics, data science, software development, or data engineering work; OR equivalent experience
- 2+ years of experience in software or data engineering, with proven proficiency in C#, Java, or equivalent
- 2+ years in one scripting language for data retrieval and manipulation (e.g., SQL or KQL)
- 2+ years of experience with ETL and cloud data technologies, including Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Logic Apps, Azure Functions, Azure Data Explorer, and Power BI or equivalent platforms

Preferred/Additional Qualifications
- 1+ years of demonstrated experience implementing data governance practices, including data access, security and privacy controls, and monitoring to comply with regulatory standards.
Other Requirements

Ability to meet Microsoft, customer and/or government security screening requirements is required for this role. These requirements include, but are not limited to, the following specialized security screenings:
- Microsoft Cloud Background Check: This position will be required to pass the Microsoft Cloud background check upon hire/transfer and every two years thereafter.

Equal Opportunity Employer (EOP)

#azdat #azuredata #fabricdata #dataintegration #azure #synapse #databases #analytics #science

Responsibilities
- You will develop and maintain data pipelines, including solutions for data collection, management, transformation, and usage, ensuring accurate data ingestion and readiness for downstream analysis, visualization, and AI model training.
- You will review, design, and implement end-to-end software life cycles, encompassing design, development, CI/CD, service reliability, recoverability, and participation in agile development practices, including on-call rotation.
- You will review and write code to implement performance monitoring protocols across data pipelines, building visualizations and aggregations to monitor pipeline health. You'll also implement solutions and self-healing processes that minimize points of failure across multiple product features.
- You will anticipate data governance needs, designing data modeling and handling procedures to ensure compliance with all applicable laws and policies.
- You will plan, implement, and enforce security and access control measures to protect sensitive resources and data.
- You will perform database administration tasks, including maintenance and performance monitoring.
- You will collaborate with Product Managers, Data and Applied Scientists, Software and Quality Engineers, and other stakeholders to understand data requirements and deliver phased solutions that meet test and quality programs' data needs and support AI model training and inference.
- You will become an SME of our team's products and provide input to the strategic vision.
- You will champion process, engineering, architecture, and product best practices in the team.
- You will work with other team Seniors and Principals to establish best practices in our organization.
- Embody our culture and values.

Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
- Industry leading healthcare
- Educational resources
- Discounts on products and services
- Savings and investments
- Maternity and paternity leave
- Generous time away
- Giving programs
- Opportunities to network and connect

Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.
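One responsibility above is building aggregations to monitor pipeline health. As a minimal sketch (assuming a run-level telemetry extract with hypothetical columns, not an actual Fabric or Azure dataset), the snippet below rolls pipeline run logs into per-pipeline failure rates and latency percentiles with pandas.

```python
import pandas as pd

# Hypothetical pipeline run log; in practice this might come from SQL/KQL telemetry.
runs = pd.DataFrame({
    "pipeline":   ["ingest_sales", "ingest_sales", "curate_orders", "curate_orders"],
    "status":     ["Succeeded", "Failed", "Succeeded", "Succeeded"],
    "duration_s": [420, 510, 95, 110],
})

health = runs.groupby("pipeline").agg(
    runs=("status", "size"),
    failure_rate=("status", lambda s: (s == "Failed").mean()),
    p95_duration_s=("duration_s", lambda d: d.quantile(0.95)),
)
print(health)
```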
Posted 10 hours ago
2.0 - 5.0 years
5 - 10 Lacs
Hyderābād
On-site
Required Skills:
- 2-5 years of experience in data engineering, analytics, and data science.
- Experience with LLMs and generative AI.
- Proficiency in Python, R, and SQL, and experience with ML libraries and frameworks like Scikit-learn, NumPy, etc.
- Familiarity with MLOps tools/platforms.
- Familiarity with Docker and Kubernetes.
- Proficiency in one or more visualization tools like Tableau.
- Experience engineering information out of massive, complex, and, in some cases, unstructured datasets.
- Ability to apply strong business sense along with technical skills to effectively balance decisions around the complexity and speed of project delivery.
- Strong written, verbal, and interpersonal communication skills; ability to effectively communicate at all levels in the organization.
- Ability to self-start and self-direct work in an unstructured environment, and comfort dealing with ambiguity.
- Excellent problem-framing, problem-solving, and project management skills, and the ability to change direction quickly.
- Ability to balance and prioritize multiple projects.
- Experience working within a cloud-based environment, SaaS.
- Experience with Git and version control workflows.
- Proficient in performance tuning and debugging.

Requirements:
As a Data Scientist, you will collaborate with a multi-disciplinary team of solution architects and data engineers on a wide range of business problems. You will be an integral part of the IT Advanced Analytics group, a team responsible for building out capabilities across business strategy, analytics, and Cloud. A Data Scientist must be able to:
- Execute all phases of the Data Science project lifecycle with minimal supervision.
- Interact with business stakeholders to gather requirements and convey project outcomes.

Job Responsibilities:
- Be an equal member of a cohesive and selfless team.
- Take complete ownership of your work with the goal of exceeding customer expectations.
- Work closely with analysts, developers, and data architects to ensure development meets requirements and delivers optimal performance.
- Work closely with internal WWT business, engineering, and technology teams.
- Contribute at all stages of data science projects: from performing raw data mining to translating complex technical topics into business solutions.
- Maintain and enhance a set of critical data models supporting our business use cases.
- Maintain complex data pipelines supporting our team's mission of democratizing data and enabling a data-driven organization, partnering with our data engineering teams.
- Effectively communicate actionable insights at all levels of the organization.
- Collaborate closely with stakeholders to improve our view of modeling and decision engines.
- Solve complex problems using advanced mathematical modeling and optimization techniques, including but not limited to big data pre-processing, problem formulation, feature engineering, algorithm selection and evaluation, hyperparameter tuning for machine learning, and deployment.
- Build and maintain models for internal customers and business teams; build knowledge and metrics for the product life cycle.
- Flexibility to work as a member of matrix-based, diverse, and geographically distributed project teams.
- Enhance subject matter expertise while working with the various business domains.
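The responsibilities above mention feature engineering, algorithm selection, and hyperparameter tuning. Here is a minimal, self-contained scikit-learn sketch of that workflow on a synthetic dataset; it illustrates the technique only and is not the team's actual modeling pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a business dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("model", RandomForestClassifier(random_state=42)),
])

# Hyperparameter tuning via cross-validated grid search.
search = GridSearchCV(
    pipeline,
    param_grid={"model__n_estimators": [100, 300], "model__max_depth": [None, 10]},
    cv=5,
    scoring="roc_auc",
)
search.fit(X_train, y_train)
print("best params:", search.best_params_)
print("held-out score:", search.score(X_test, y_test))
```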
Posted 10 hours ago
4.0 - 6.0 years
5 - 18 Lacs
Hyderābād
On-site
Job summary:
1. 4-6 years of solid experience in SQL, preferably Teradata.
2. 1-2 years of basic programming skills.
3. Basic analyst and data quality check skills (manual testing).
4. Good to have: basic understanding of Airflow and DataStage.
5. Good to have: basic cloud understanding.

Job Type: Full-time
Pay: ₹500,298.14 - ₹1,850,039.92 per year
Work Location: In person
Posted 10 hours ago
5.0 years
0 Lacs
Mayur Vihar, Delhi, India
On-site
About Devdoot

Devdoot is India’s next-generation health-tech platform dedicated to providing real-time, on-demand emergency and healthcare services. From ambulances and diagnostics to medicine delivery and doctor discovery, we are building a seamless and scalable digital healthcare ecosystem. As we scale across cities, we are looking to strengthen our tech team with a skilled FastAPI Backend Developer.

Role Overview

We are seeking a backend developer with strong proficiency in Python and hands-on experience with FastAPI to build and maintain scalable APIs and backend systems. You will work closely with our frontend, DevOps, and product teams to implement core platform features, ensure high availability, and improve system performance.

Key Responsibilities
- Develop, test, and maintain backend services and APIs using FastAPI
- Design scalable architectures for real-time healthcare service delivery
- Implement user authentication, authorization, and secure data handling
- Work with databases such as PostgreSQL, MongoDB, or MySQL
- Integrate third-party services, APIs, and partner platforms
- Write clean, efficient, and well-documented code
- Collaborate with the frontend and DevOps teams for smooth deployment

Key Skills and Requirements
- Strong command of Python 3.x and the FastAPI framework
- Experience with RESTful API design and microservices architecture
- Working knowledge of SQL (PostgreSQL/MySQL) and NoSQL (MongoDB)
- Understanding of asynchronous programming, background tasks, and event queues
- Familiarity with containerization tools like Docker and version control systems like Git
- Exposure to deployment on cloud platforms (AWS, GCP, DigitalOcean) is a plus
- Basic understanding of React.js or Node.js is a bonus
- Strong problem-solving skills and ability to work in a fast-paced environment

Preferred Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related field
- 2–5 years of experience in backend development
- Experience in health-tech or mission-critical service platforms is preferred
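As a minimal sketch of the FastAPI work described above (endpoint names, fields, and the in-memory store are hypothetical placeholders, not Devdoot's actual API), the example below defines async endpoints with Pydantic validation and a background task:

```python
from fastapi import BackgroundTasks, FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Ambulance Dispatch API")  # hypothetical service name

class DispatchRequest(BaseModel):
    patient_name: str
    latitude: float
    longitude: float

def notify_partner(request_id: int) -> None:
    # Placeholder for a third-party/partner integration call.
    print(f"notifying partner fleet about request {request_id}")

_requests: dict[int, DispatchRequest] = {}  # in-memory store for illustration only

@app.post("/dispatch", status_code=201)
async def create_dispatch(req: DispatchRequest, tasks: BackgroundTasks) -> dict:
    request_id = len(_requests) + 1
    _requests[request_id] = req
    tasks.add_task(notify_partner, request_id)  # runs after the response is sent
    return {"request_id": request_id, "status": "queued"}

@app.get("/dispatch/{request_id}")
async def get_dispatch(request_id: int) -> DispatchRequest:
    if request_id not in _requests:
        raise HTTPException(status_code=404, detail="dispatch request not found")
    return _requests[request_id]
```

Saved as main.py, this can be run locally with `uvicorn main:app --reload`.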
Posted 10 hours ago
0 years
5 - 18 Lacs
Hyderābād
On-site
Primary Skills: ReactJS/Node.js, SQL, MUI
Secondary Skills: Azure (Azure App Services, Service Bus, Storage Blob, Key Vault, Terraform, Azure Functions using any language, Cosmos DB)
Developer Skills/Tools: GitHub or any other source control/CI-CD, Rally, VS Code, and Postman

Job Type: Full-time
Pay: ₹500,298.14 - ₹1,850,039.92 per year
Work Location: In person
Posted 10 hours ago
1.0 years
0 - 0 Lacs
Hyderābād
On-site
Job Title: Quality Analyst
Location: Hyderabad (Onsite)
Experience: 1 to 4 Years
Employment Type: Full-Time (Note: immediate joiners or 15 days' notice)

Role Overview:
We are hiring a Quality Analyst with experience in both manual and automation testing. The candidate must have a deep understanding of software quality processes, be detail-oriented, and be capable of working in fast-paced Agile environments.

Key Responsibilities:
- Prepare detailed test plans, test cases, and test scripts
- Perform functional, regression, smoke, UI/UX, and cross-browser testing
- Log and track bugs using JIRA or similar tools
- Conduct API testing using Postman
- Develop and maintain automation test scripts using Selenium / TestNG / Java or Python
- Participate in Agile ceremonies like sprint planning and daily standups
- Execute SQL queries for database validation
- Work closely with developers and product managers to ensure high-quality releases
- Integrate test scripts into CI/CD pipelines (Git, Jenkins)

Required Skills:
- 1 to 4 years of experience in manual + automation testing
- Strong knowledge of STLC, SDLC, and the defect life cycle
- Proficiency in Selenium WebDriver, TestNG, and Postman
- Java or Python scripting for automation
- Working experience with bug tracking tools like JIRA
- Basic understanding of Git and CI tools like Jenkins
- Good communication and problem-solving skills
- Knowledge of SQL for backend/data validation

Good to Have (Not Mandatory):
- Exposure to Cypress / Playwright / REST Assured
- Experience in mobile app testing (Android/iOS)
- ISTQB certification
- Familiarity with performance testing tools like JMeter

Educational Qualification: B.E./B.Tech/MCA or equivalent in Computer Science / IT / a related field

About the Company
Welcome to Bizionic, a leading software development and marketing company that empowers businesses to thrive digitally. With a comprehensive suite of services, we combine cutting-edge software development expertise with strategic marketing solutions to help our clients achieve their goals and outshine their competition. Partnering with Bizionic means gaining a dedicated team that is passionate about your success. We work collaboratively, keeping you informed and involved throughout the entire process. Our commitment to delivering on time, within budget, and exceeding your expectations remains unwavering. Embrace the power of integrated software development and marketing with Bizionic. Contact us today to embark on a transformative journey to elevate your brand, expand your reach, and accelerate your business growth. Let's pave the way for digital success in an ever-evolving market together. Bizionic T&C applies.

Job Types: Full-time, Permanent
Pay: ₹15,210.92 - ₹35,303.03 per month
Benefits:
- Food provided
- Provident Fund
Schedule: Day shift
Work Location: In person
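A minimal sketch of the automation work listed above, using Selenium WebDriver in Python with pytest and an explicit wait; the target URL and locator are hypothetical placeholders:

```python
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

@pytest.fixture
def driver():
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")  # run without a visible browser window
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()

def test_login_page_renders_heading(driver):
    # Hypothetical application under test.
    driver.get("https://example.com/login")
    heading = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert heading.text  # smoke check: the page rendered a heading
```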
Posted 10 hours ago
3.0 years
0 Lacs
Telangana
On-site
Job Title: Developer
Experience: 3-7 years
No. of positions: 1

Responsibilities:
- Responsible for understanding the requirements and performing data analysis.
- Responsible for setup of Microsoft Fabric and its components.
- Building secure, scalable solutions across the Microsoft Fabric platform.
- Create and manage Lakehouses.
- Implement Data Factory processes for scalable ETL and data integration.
- Design, implement, and manage comprehensive data warehousing solutions for analytics using Fabric.
- Creating and scheduling data pipelines using Azure Data Factory.
- Building robust data solutions using Microsoft data engineering tools.
- Create and manage Power BI reports and semantic models.
- Write and optimize complex SQL queries to extract and analyze data, ensuring accurate data processing and reporting.
- Work closely with customers, business analysts, and technology and project teams to understand business requirements, and drive the analysis and design of quality technical solutions that are aligned with business and technology strategies and comply with the organization's architectural standards.
- Understand and follow through on change management procedures to implement project deliverables.
- Coordinate with support groups to get issues resolved with a quick turnaround time.

Mandatory:
- Bachelor's degree in computer science or a similar field, or equivalent work experience.
- 3+ years of experience working in Microsoft Fabric.
- Expertise in working with OneLake and Lakehouses.
- Strong understanding of Power BI reports and semantic models using Fabric.
- Proven track record of building ETL and data solutions using Azure Data Factory.
- Strong understanding of data warehousing concepts and ETL processes.
- Hands-on experience building data warehouses in Fabric.
- Strong skills in Python and PySpark.
- Practical experience implementing Spark in Fabric, scheduling Spark jobs, and writing Spark SQL queries.
- Knowledge of real-time analytics in Fabric.
- Experience utilizing Data Activator for effective data asset management and analytics.
- Ability to flex and adapt to different tools and technologies, with a strong learning attitude.
- Good written and verbal communication skills.
- Demonstrated experience working in a team spread across multiple locations.

Preferable:
- Knowledge of AWS services
- Knowledge of Snowflake

Location: NOIDA
Timings: 2:00 PM to 10:30 PM
Cab facility provided: Yes
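As a minimal sketch of the Spark-in-Fabric work described above: a Fabric notebook normally provides a ready-made `spark` session, and the table and column names here are hypothetical placeholders rather than a real Lakehouse schema.

```python
from pyspark.sql import SparkSession, functions as F

# Inside a Fabric notebook a `spark` session already exists and this line is
# unnecessary; it is included so the sketch is self-contained.
spark = SparkSession.builder.appName("lakehouse-aggregation").getOrCreate()

# Read a hypothetical Lakehouse table, aggregate, and write a curated table.
sales = spark.read.table("bronze_sales")

daily_revenue = (
    sales.groupBy(F.to_date("order_ts").alias("order_date"), "region")
         .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)
daily_revenue.write.mode("overwrite").saveAsTable("gold_daily_revenue")

# The same aggregation expressed as a Spark SQL query:
spark.sql("""
    SELECT to_date(order_ts) AS order_date, region,
           SUM(amount) AS revenue, COUNT(*) AS orders
    FROM bronze_sales
    GROUP BY to_date(order_ts), region
""").show(5)
```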
Posted 10 hours ago
15.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Job Purpose:
Primarily responsible for ensuring and managing Database Incident Management (L1), Problem Management (L2), Performance Management, Obsolescence Management, and Project Management (L3) of WBO applications and Government initiatives. This will cover areas like:
- Incident Management
- Problem Management
- Application Performance Management
- Application Obsolescence Management
- NSI - Project Management, Enhancements
- Vendor Performance Management
- Review of OS/DB Baseline and Privileged User IDs on OS/DB
- Contract Review
- Change and Release Management
- Budgeting
- Review of installation and creation of databases in applications
- Periodic review of health checks of DB Backup & Restoration
- RAC installation procedures and configuration
- Performance tuning
- Review with the Oracle team on issues raised through Oracle SRs and arranging fixes
- Planning of database upgrades in UAT, Pre-Prod, Prod, and DR
- Upgrade testing using tools such as RAT and SPA
- Support during load testing, recommendations on performance issues during load tests, and other important projects
- Database health check review

Key Skillsets:
- Experience in administration of application server / middleware technologies like IBM WebSphere, Oracle WebLogic, IIS, etc.
- Development and production support experience on Java platforms and technologies like HTML5.
- Knowledge of cloud and containerization technologies like OpenShift is preferred.
- Knowledge of portal technologies, JSON, and REST and SOAP based APIs.
- Knowledge of integration technologies like XML, WSDL, SOA, etc.
- Knowledge of server and storage hardware.
- Knowledge of network and network security technologies like firewalls, SSL, etc.
- Experience of having worked with operating systems like Linux, Solaris, AIX, etc.
- Experience of having worked with database technologies like Oracle DB, SQL Server, etc.
- Knowledge of Government Business and UIDAI transaction flow.
- Good communication skills, a positive attitude, and preferably certification in the technologies mentioned above.
- Exposure to architectural concepts/design across diverse technology platforms.
- Strong quantitative skills with the ability to discern quality of information and patterns in data.

Experience: 15+ years
Posted 10 hours ago
0 years
0 Lacs
Hyderābād
On-site
Responsibilities:
- Develop and maintain Java-based microservices using Spring Boot
- Implement scalable backend systems and REST APIs
- Work with Kubernetes and Docker to deploy and manage containerized applications
- Integrate with Azure services (e.g., AKS, App Services, Blob Storage)
- Work with Kafka for event streaming, publish/subscribe messaging, and system integration
- Collaborate with cross-functional teams to implement features for loyalty programs, customer offers, e-commerce flows, and loyalty migration
- Write unit and integration tests, and participate in peer code reviews
- Follow best practices for coding, security, and performance
- Support troubleshooting and production issue resolution

Required Skills:
- Strong experience with Java (11/17/21) and Spring Boot
- Hands-on experience building and maintaining microservices
- Solid understanding of Docker and Kubernetes for container orchestration
- Experience working with Azure Cloud (AKS, App Services, Functions, etc.)
- Practical knowledge of Kafka (producers, consumers, and streaming)
- Familiarity with SQL and NoSQL databases (e.g., PostgreSQL, MongoDB)
- Experience with CI/CD pipelines and version control (e.g., Git, Azure DevOps)
- Agile/Scrum development experience

Domain Experience:
- Experience working on e-commerce or customer loyalty platforms
- Understanding of loyalty program mechanics like rewards, tiers, and customer engagement

Nice to Have:
- Familiarity with monitoring/logging tools like Grafana, Prometheus, Azure Monitor
- Exposure to test automation frameworks
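The role itself is Java/Spring, but to keep the code samples in this page in a single language, the Kafka publish/subscribe pattern it mentions is sketched below in Python with the kafka-python client. The broker address, topic name, and payload are placeholders.

```python
import json
from kafka import KafkaConsumer, KafkaProducer  # pip install kafka-python

BROKER = "localhost:9092"          # placeholder broker address
TOPIC = "loyalty-points-earned"    # hypothetical topic name

# Producer side: publish a loyalty event as JSON.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"customerId": "C-42", "points": 120})
producer.flush()

# Consumer side: subscribe and process events as they arrive.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="rewards-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print("awarding points:", message.value)
    break  # process a single message for this illustration
```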
Posted 10 hours ago
0 years
3 - 7 Lacs
Hyderābād
On-site
- Bachelor's degree
- Knowledge of Microsoft Office products and applications at an advanced level
- Experience working in a large public accounting firm or multi-national corporate tax department

Amazon is seeking a Tax Analyst to join the State & Local Audit team in Hyderabad, India. The Amazon Tax Department is a fast-paced, team-focused, and dynamic environment. This position will be primarily responsible for supporting sales & use tax audits as well as related indirect tax projects.

Key job responsibilities
- Prepare and review responses to audit inquiries
- Retrieve and analyze data and supporting documentation responsive to audits and information requests
- Collaborate with business and technical teams on process improvement initiatives

A day in the life
The SALT Audit team manages State and Local indirect and direct tax audit and controversy matters for Amazon. Our scope also includes a self-audit function, management of statutory credits/incentives and FAS5 for US indirect tax, as well as unclaimed property compliance and recovery work.

- Knowledge of at least one data-focused technology tool, such as Python, SQL, Alteryx, Amazon QuickSight, or similar
- Self-starter with the ability to prioritize tasks and independently define, implement, and manage the creation of new processes

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
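As a hedged illustration of the data-retrieval-and-analysis work mentioned in the qualifications (the columns and figures are made up, not Amazon data), the snippet below reconciles collected versus remitted sales tax by jurisdiction with pandas:

```python
import pandas as pd

# Hypothetical extracts an analyst might pull from transaction and filing systems.
transactions = pd.DataFrame({
    "jurisdiction": ["TX", "TX", "WA"],
    "tax_collected": [1250.00, 310.50, 980.25],
})
filings = pd.DataFrame({
    "jurisdiction": ["TX", "WA"],
    "tax_remitted": [1500.00, 980.25],
})

collected = transactions.groupby("jurisdiction", as_index=False)["tax_collected"].sum()
recon = collected.merge(filings, on="jurisdiction", how="outer").fillna(0.0)
recon["variance"] = recon["tax_collected"] - recon["tax_remitted"]

# Flag jurisdictions where collections and remittances disagree.
print(recon[recon["variance"].abs() > 0.01])
```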
Posted 10 hours ago
SQL (Structured Query Language) is a crucial skill in the field of data management and analysis. In India, the demand for professionals with SQL expertise is on the rise, with numerous job opportunities available across various industries. Job seekers looking to break into the IT sector or advance their careers in data-related roles can benefit greatly from acquiring SQL skills.
Cities with thriving IT industries are hotspots for SQL job openings.
In India, the average salary range for SQL professionals varies based on experience levels. Entry-level positions can expect to earn around ₹3-5 lakhs per annum, while experienced professionals with 5+ years of experience can earn anywhere from ₹8-15 lakhs per annum.
A typical career progression in the SQL domain may include roles such as:
- Junior SQL Developer
- SQL Developer
- Senior SQL Developer
- Database Administrator
- Data Analyst
- Data Scientist
Advancing to higher roles like Tech Lead or Data Architect is possible with increased experience and expertise.
In addition to SQL proficiency, job seekers in India may benefit from having skills such as:
- Data analysis and visualization tools (e.g., Tableau, Power BI)
- Programming languages (e.g., Python, R), as shown in the short sketch after this list
- Knowledge of database management systems (e.g., MySQL, Oracle)
- Understanding of data warehousing concepts
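A minimal sketch of pairing SQL with Python using the standard library's sqlite3 module; the table and data are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE employees (name TEXT, department TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Asha", "Data", 900000), ("Ravi", "Data", 750000), ("Meena", "IT", 650000)],
)

# A typical interview-style aggregation: average salary per department.
query = """
    SELECT department, AVG(salary) AS avg_salary, COUNT(*) AS headcount
    FROM employees
    GROUP BY department
    ORDER BY avg_salary DESC
"""
for department, avg_salary, headcount in conn.execute(query):
    print(department, round(avg_salary), headcount)

conn.close()
```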
Here are 25 SQL interview questions to help you prepare for job interviews in India:
As you explore SQL job opportunities in India, remember to not only focus on mastering SQL but also to develop related skills that can make you a well-rounded professional in the data management field. Prepare thoroughly for interviews by practicing common SQL questions and showcase your expertise confidently. Good luck with your job search!