
1203 Normalization Jobs - Page 21

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 3.0 years

0 Lacs

India

On-site

Flexera saves customers billions of dollars in wasted technology spend. A pioneer in Hybrid ITAM and FinOps, Flexera provides award-winning, data-oriented SaaS solutions for technology value optimization (TVO), enabling IT, finance, procurement and cloud teams to gain deep insights into cost optimization, compliance and risks for each business service. Flexera One solutions are built on a set of definitive customer, supplier and industry data, powered by our Technology Intelligence Platform, that enables organizations to visualize their Enterprise Technology Blueprint™ in hybrid environments—from on-premises to SaaS to containers to cloud. We're transforming the software industry. We're Flexera. With more than 50,000 customers across the world, we're achieving that goal. But we know we can't do any of that without our team. Ready to help us re-imagine the industry during a time of substantial growth and ambitious plans? Come and see why we're consistently recognized by Gartner, Forrester and IDC as a category leader in the marketplace. Learn more at flexera.com

Responsibilities
- Respond to requests to investigate content-related issues, identify root cause, and remediate
- Conduct research, investigate and collect data on software and hardware products from various sources and curate the data into the Data Platform
- Relate data points from 3rd-party vendors and product suppliers and ensure consistency, quality, and accuracy in our normalized database (see the sketch below)
- Advise how data drives customer value, business use cases and decision making
- Identify, analyze, and interpret trends or patterns in complex data sets
- Track the latest information from the IT market and several other vertical markets (Medical, Finance and Banking) and update/maintain a comprehensive reference catalog with the most up-to-date information
- Operate with consistency, quality, and accuracy in relation to our Content Operations standards
- Communicate effectively with Support, Engineering and Product Management regarding enrichment, defects, data alignments and gap-fill requests
- Contribute to continuous improvement initiatives and update articles on our Flexera Knowledge Base
- Confidently promote our team's principles across the organization
- Create tooling and systems for better maintenance and monitoring of the content services
- Understand the system, tool and application workflows and adapt to them

Requirements
To be successful, the Content Ops hire will need to have some (if not all) of the following attributes:
- Engineering/MCA graduate with 2-3 years' experience in Content Ops
- Firm understanding of IT Asset Management (ITAM), software and hardware assemblies, version, edition, and release management
- Familiarity with software licensing, including open source and vulnerabilities
- Authoritative knowledge of at least one mainstream development language such as Java, Python, Go, or JavaScript
- Familiar with stored procedures in SQL Server/Oracle (should be clear on data normalization: collect, interpret, analyze, and report)
- Familiarity with SaaS, PaaS, IaaS and cloud products
- Strong research skills, able to investigate, locate, and collect specific information quickly and accurately
- Strong reading comprehension skills, able to understand information found on web pages, product documentation, technical and marketing articles and extract specific content quickly and accurately
- Comfortable dealing with huge amounts of data and able to organize them according to specific rules and patterns
- Familiar with ticketing tools such as ServiceNow, Salesforce and JIRA
- Knowledge of databases, APIs, SQL, scripting languages and process automation is advantageous
- Strong interpersonal skills, a team player

The following personal qualities are desired:
- Passionate about self-learning and active in the (local or online) tech community
- Highly motivated with attention to detail and strong problem-solving skills
- Self-driven and prepared to go the extra mile when the team is up against it
- Achieves the right balance of confidence and respect
- Ability to communicate effectively and efficiently within and between teams

Flexera is proud to be an equal opportunity employer. Qualified applicants will be considered for open roles regardless of age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by local/national laws, policies and/or regulations. Flexera understands the value that results from employing a diverse, equitable, and inclusive workforce. We recognize that equity necessitates acknowledging past exclusion and that inclusion requires intentional effort. Our DEI (Diversity, Equity, and Inclusion) council is the driving force behind our commitment to championing policies and practices that foster a welcoming environment for all. We encourage candidates requiring accommodations to please let us know by emailing careers@flexera.com.
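
The "normalized database" responsibility above is essentially about reconciling messy product records into one canonical form. Purely as an illustration (the titles, alias table, and rules below are hypothetical, not Flexera's actual catalog or tooling), such a normalization pass might look like this in Python:

```python
import re

# Hypothetical raw inventory strings as they might arrive from discovery tools.
RAW_TITLES = [
    "Microsoft Office Professional Plus 2019 - 16.0.10337",
    "microsoft office pro plus 2019 v16.0.10337",
    "Adobe Acrobat Reader DC 21.005.20058",
]

# Toy alias table mapping vendor/product spellings to canonical names.
ALIASES = {
    "microsoft office pro plus": "Microsoft Office Professional Plus",
    "microsoft office professional plus": "Microsoft Office Professional Plus",
    "adobe acrobat reader dc": "Adobe Acrobat Reader DC",
}

VERSION_RE = re.compile(r"v?(\d+(?:\.\d+)+)")

def normalize(title: str) -> dict:
    """Split a raw title into a canonical product name plus version."""
    version_match = VERSION_RE.search(title)
    version = version_match.group(1) if version_match else None
    # Strip the version and separators, then lower-case for alias lookup.
    name_part = VERSION_RE.sub("", title).replace("-", " ")
    name_key = re.sub(r"\s+", " ", name_part).strip().lower()
    # Drop a trailing year token (e.g. "2019") before looking up the alias.
    name_key_no_year = re.sub(r"\b(19|20)\d{2}\b", "", name_key).strip()
    canonical = ALIASES.get(name_key_no_year, name_part.strip())
    return {"raw": title, "product": canonical, "version": version}

if __name__ == "__main__":
    for row in map(normalize, RAW_TITLES):
        print(row)
```

In practice the alias table and version rules would come from curated reference data rather than a hard-coded dictionary; the point is only that disparate spellings collapse to one product record.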

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JOB DESCRIPTION
"While we have our offices in Bangalore, Chennai, Hyderabad, Nagpur and Pune, this position is hybrid, with you being able to report to the location nearest to your current location if the need arises."

We are looking for a highly experienced NLP Analyst with deep expertise in linguistic data analysis, annotation design, and production-scale NLP model evaluation. This role requires a blend of linguistic acumen, analytical rigor, and real-world application experience. You will drive the design and execution of NLP initiatives across diverse domains and guide cross-functional teams on best practices in language data handling and annotation quality.

Key Responsibilities
- Manage large-scale text annotation and labeling pipelines for supervised and semi-supervised learning.
- Conduct advanced linguistic analysis of unstructured content (e.g., clinical notes, legal contracts, customer communications, claim documents) to identify patterns, gaps, and modeling opportunities.
- Define and enforce annotation schemas and QA protocols for complex NLP tasks (e.g., NER, relation extraction, coreference resolution, sentiment/intent classification).
- Evaluate and improve the performance of NLP models through rigorous error analysis and metric-driven feedback loops (see the sketch below).
- Collaborate with ML/NLP engineers, data scientists, and domain experts to build robust NLP pipelines that scale across use cases.
- Lead internal research efforts on emerging NLP methodologies, including LLM prompt engineering, hybrid rule-learning approaches, and few-shot learning.
- Provide mentorship to junior analysts and contribute to developing internal NLP knowledge repositories and annotation standards.

Required Qualifications
- Master's in Computational Linguistics, NLP, Data Science, Computer Science, or a related field.
- 5+ years of professional experience in NLP, with a strong track record of hands-on work in data annotation, language model evaluation, and NLP pipeline development.
- Expertise in Python and key NLP libraries (spaCy, NLTK, scikit-learn, Hugging Face Transformers, etc.).
- Advanced proficiency in building and managing annotation workflows using tools like Prodigy, doccano, brat, or in-house platforms.
- Deep understanding of linguistic structures (syntax, semantics, pragmatics) and their application to real-world NLP challenges.
- Experience evaluating ML/NLP models using metrics like F1, ROUGE, BLEU, precision/recall, and embedding-based similarity.
- Solid grasp of vectorization methods (TF-IDF, embeddings, transformer-based encodings) and modern language models (e.g., BERT, GPT, LLaMA).

Preferred Qualifications
- Experience with domain-specific NLP (e.g., clinical/biomedical, legal, fintech).
- Knowledge of knowledge graph construction, relation extraction, and entity linking.
- Experience integrating structured/unstructured data for downstream AI/ML applications.
- Familiarity with prompt engineering for LLMs and tuning foundation models.
- Strong data querying and visualization skills (SQL, pandas, seaborn, Power BI/Tableau).

Perficient is always looking for the best and brightest talent and we need you! We're a quickly growing, global digital consulting leader, and we're transforming the world's largest enterprises and biggest brands. You'll work with the latest technologies, expand your skills, and become part of our global community of talented, diverse, and knowledgeable colleagues.

RESPONSIBILITIES
Key Responsibilities
- Preprocess and clean raw text data for downstream NLP applications (e.g., tokenization, normalization, entity recognition).
- Annotate and label datasets for supervised learning tasks (e.g., intent classification, sentiment analysis, NER).
- Analyze and visualize linguistic patterns and insights from textual data.
- Work with data scientists to evaluate and improve model performance.
- Support the development of rule-based and machine learning-based NLP pipelines.
- Document and maintain guidelines for annotation and linguistic QA.
- Collaborate with stakeholders to understand domain-specific language challenges and requirements.

QUALIFICATIONS
Required Qualifications
- Bachelor's degree in Linguistics, Computer Science, Data Science, or a related field.
- 3+ years of hands-on experience with text analysis or NLP tasks.
- Proficiency in Python and common NLP libraries (e.g., spaCy, NLTK, pandas).
- Experience working with annotation tools (e.g., Prodigy, Labelbox, doccano).
- Strong understanding of language structure and linguistic features.
- Ability to apply regular expressions and text parsing techniques effectively.
- Familiarity with data visualization tools and basic statistics.

Preferred Qualifications
- Experience in domain-specific NLP (e.g., clinical/biomedical, legal, financial).
- Knowledge of vectorization methods (TF-IDF, word2vec, BERT embeddings).
- Exposure to ML model evaluation metrics (e.g., precision, recall, F1 score).
- Experience with SQL and working with large datasets.
- Familiarity with LLMs (e.g., OpenAI, Hugging Face Transformers).

Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.

WHO WE ARE
Perficient is a leading global digital consultancy. We imagine, create, engineer, and run digital transformation solutions that help our clients exceed customers' expectations, outpace competition, and grow their business. With unparalleled strategy, creative, and technology capabilities, our colleagues bring big thinking and innovative ideas, along with a practical approach to help our clients – the world's largest enterprises and biggest brands – succeed.

WHAT WE BELIEVE
At Perficient, we promise to challenge, champion, and celebrate our people. You will experience a unique and collaborative culture that values every voice. Join our team, and you'll become part of something truly special. We believe in developing a workforce that is as diverse and inclusive as the clients we work with. We're committed to actively listening, learning, and acting to further advance our organization, our communities, and our future leaders… and we're not done yet.

Perficient, Inc. proudly provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, national origin, age, disability, genetic information, marital status, amnesty, or status as a protected veteran in accordance with applicable federal, state and local laws. Perficient, Inc. complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including, but not limited to, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training. Perficient, Inc. expressly prohibits any form of unlawful employee harassment based on race, color, religion, gender, sexual orientation, national origin, age, genetic information, disability, or covered veterans. Improper interference with the ability of Perficient, Inc. employees to perform their expected job duties is absolutely not tolerated.

Disability Accommodations: Perficient is committed to providing a barrier-free employment process with reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or accommodation due to a disability, please contact us.

Disclaimer: The above statements are not intended to be a complete statement of job content, rather to act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add or change the duties of the position at any time.

ABOUT US
Perficient is always looking for the best and brightest talent and we need you! We're a quickly growing, global digital consulting leader, and we're transforming the world's largest enterprises and biggest brands. You'll work with the latest technologies, expand your skills, experience work-life balance, and become part of our global community of talented, diverse, and knowledgeable colleagues.

Select work authorization questions to ask when applicants apply:
1. Are you legally authorized to work in the United States?
2. Will you now, or in the future, require sponsorship for employment visa status (e.g. H-1B visa status)?
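
For the model-evaluation requirements above (precision/recall/F1 over annotated entities), here is a minimal, self-contained sketch; the span format and toy data are made up for illustration and are not Perficient's tooling:

```python
# Entity-level precision/recall/F1 for an NER evaluation loop.
# Entities are (start, end, label) spans; exact-match scoring only.

from typing import List, Set, Tuple

Span = Tuple[int, int, str]

def prf1(gold: List[Set[Span]], pred: List[Set[Span]]) -> Tuple[float, float, float]:
    """Micro-averaged precision, recall and F1 over a list of documents."""
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        tp += len(g & p)          # spans predicted exactly right
        fp += len(p - g)          # spurious predictions
        fn += len(g - p)          # missed gold entities
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy example: one document, gold annotations vs. model output.
gold = [{(0, 12, "ORG"), (20, 28, "DATE")}]
pred = [{(0, 12, "ORG"), (30, 35, "GPE")}]
print("P=%.2f R=%.2f F1=%.2f" % prf1(gold, pred))   # P=0.50 R=0.50 F1=0.50
```

Error analysis would then drill into the false positives and false negatives per label rather than stopping at the aggregate numbers.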

Posted 1 month ago

Apply

0.0 - 1.0 years

7 - 8 Lacs

Hyderabad, Telangana

On-site

Greetings from Star Secutech Pvt Ltd! A huge welcome to immediate joiners!

Job Title: SR Executive
Reporting to: Team Leader/AM/DM
Location: Hyderabad
Working Hours/Days: 9 hours / 5 days a week
Shift: U.S. shift (5:30 PM – 2:30 AM)
Salary: 7-8 LPA (negotiable)

Job Role:
- Data types: Identify the data types of each data set and ensure compatibility.
- Harmonization process: Develop a harmonization process that outlines the steps required to harmonize data, such as data cleansing, normalization, and validation (see the sketch below).
- Disparate data sources: Consider data sources that may have different formats, such as databases, spreadsheets, and APIs. Develop methods to integrate and harmonize data from various sources.
- Harmonization tools: Utilize various tools and technologies, such as extract, transform, load (ETL) tools, data integration platforms, and data cleansing software, to streamline the harmonization process.
- Harmonization schema: Define a harmonization schema that standardizes the data structure, format, and terminology across different data sets.

Interested candidates, don't wait: call or DM 9087726632 to proceed with the interview and start working. All the best!

Job Types: Full-time, Permanent
Pay: ₹700,000.00 - ₹800,000.00 per year
Benefits: Health insurance, Leave encashment, Paid sick time, Paid time off, Provident Fund
Schedule: Evening shift, Fixed shift, Monday to Friday, Night shift, UK shift, US shift
Supplemental Pay: Performance bonus, Shift allowance, Yearly bonus
Education: Bachelor's (Required)
Experience: Pharmacovigilance: 1 year (Required)
Location: Hyderabad, Telangana (Required)
Shift availability: Night Shift (Required)
Work Location: In person
Application Deadline: 29/07/2025
Expected Start Date: 07/07/2025
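
A minimal cleanse-normalize-validate pass of the kind described above, sketched with pandas. The column names, date formats, and severity vocabulary are hypothetical, not the employer's actual schema:

```python
from datetime import datetime
import pandas as pd

DATE_FORMATS = ("%Y/%m/%d", "%d-%m-%Y", "%Y-%m-%d")   # formats seen across sources

def parse_date(value: str):
    """Try each known source format; return None when nothing matches."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value.strip(), fmt).date()
        except ValueError:
            continue
    return None

raw = pd.DataFrame({
    "case_id": [" A-001", "A-002", "A-002", "a-003 "],
    "report_date": ["2025/07/01", "01-07-2025", "01-07-2025", "July 3rd"],
    "severity": ["HIGH", "high", "high", "Moderate"],
})

# 1. Cleansing: trim and upper-case identifiers, drop exact duplicates.
clean = raw.assign(case_id=raw["case_id"].str.strip().str.upper()).drop_duplicates()

# 2. Normalization: one date type, one controlled vocabulary for severity.
clean["report_date"] = clean["report_date"].map(parse_date)
clean["severity"] = clean["severity"].str.title().replace({"Moderate": "Medium"})

# 3. Validation: flag rows that still break the harmonization schema.
clean["valid"] = clean["report_date"].notna() & clean["severity"].isin(["Low", "Medium", "High"])
print(clean)
```

The same three steps scale up when an ETL tool or integration platform runs them, which is what the harmonization-tools bullet refers to.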

Posted 1 month ago

Apply

14.0 years

0 Lacs

India

Remote

At Medtronic you can begin a life-long career of exploration and innovation, while helping champion healthcare access and equity for all. You'll lead with purpose, breaking down barriers to innovation in a more connected, compassionate world.

Careers that Change Lives
Principal Data Software Engineer in the Cardiac Rhythm Disease Management (CRDM) R&D Software Organization, developing software supporting Medtronic implantable cardiac devices. The individual will operate in all phases and contribute to all activities of the software development process. Candidates must be willing to work in a fast-paced, multi-tasking, team environment.

A Day in the Life
- Design, develop and test high-integrity software for Class II and III medical devices.
- Learn and understand software standards for medical devices, e.g. IEC 62304.
- Define and implement software requirements and designs and review software developed by other team members.
- Contribute and apply advanced technical principles, theories, and concepts to solve complex technical problems.
- Participate in process improvement initiatives for the software team. This includes recognizing areas for improvement as well as working with others to develop and document process improvements.
- Demonstrate ownership of a software feature/module and drive development of the feature/module through the SDLC.
- Provide hands-on leadership, coaching, mentoring, and software engineering best practices to junior software engineers.
- Develop reusable patterns and encourage innovation that will increase team velocity.
- Maintain, improve and design new software tools. These tools use either scripting languages (Perl, Python), programming languages (Java, C, C#), or web technology (HTML5, JavaScript).
- Work under general direction and collaboratively with internal and external partners.
- Continuously keep updated with the latest technology trends and channel that learning into Medtronic product development.

Must-Have Job Responsibilities
- Experience in software design for medical devices.
- Hands-on experience in developing implantable system software components related to data acquisition, real-time data processing and data presentation.
- Experience in defining control-system state machines for processing real-time data and synchronizing real-time data across different inputs (see the sketch below).
- Applying industry-standard best practices to develop system software complying with security requirements to ensure patient privacy and safety.
- Experience in developing firmware and device drivers for embedded peripherals.
- Experience in developing simulators for simulating implantable device behavior through design patterns and architecture patterns.
- Hands-on experience in Bluetooth-enabled device communication.
- Hands-on experience in SVG graphics-based development.
- Hands-on experience in mobile operating system app development targeted at Class III medical systems.
- Strong oral and written communication skills.
- Experience with configuration management tools.
- Proficiency working in a team environment.
- Demonstrated skills in writing engineering documents (specifications, project plans, etc.).

Must-Have Minimum Qualification
- B.E/B.Tech in Computer Science Engineering and 14+ years of experience (or M.E/M.Tech in Computer Science and 12+ years).
- Strong programming skills in C#, .NET and/or C/C++.
- Strong knowledge of software design, development, debug and test practices.
- Apply best practices to develop software driven by a test-first approach.
- Create automation protocols to test a complex software stack for behavior and coverage.
- Provide design guidance for designing networking services (web services, SOAP and REST services) for communicating over TCP/UDP between a tablet and external servers.
- Perform thorough analysis and synthesis of the data at hand to apply relevant software engineering algorithms to provide the best user experience for real-time data representation.
- Should be able to design systems that comply with object-oriented design patterns for scalability and extensibility.
- Should be able to analyze system requirements, map them to subsystem requirements, create designs and design artifacts using UML diagrams, and provide traceability into requirements.
- Should be able to understand operating system thread priorities and thread scheduling concepts, and apply those concepts to realize efficient and optimal flow of data through the system for real-time data processing.
- Apply software engineering principles for requirement analysis, requirement prioritization, and life-cycle models such as waterfall and Agile.
- Should be able to understand web-based application design, remote procedure calls and distributed computing, and apply those concepts to product development.
- Should be able to understand concepts of relational database management and normalization of tables, and design well-normalized database tables.
- Should be able to understand socket communication and design/develop applications involving socket communication across process boundaries.
- Should be able to perform build system management through a thorough understanding of compiler optimization and compiler design.

Principal Working Relationship
Reports to the Engineering Manager. The Senior Software Engineer frequently interacts with the Product Owner, Tech Lead, other developers, V&V engineers, internal partners and stakeholders concerning estimations, design, implementation or requirement clarifications, and works closely with global sites.

Nice to Haves
- 5+ years of experience in software design for medical devices
- Strong leadership skills and mentoring capabilities
- Experience in mobile software development, e.g. iOS, Android
- Experience in web-based technologies, e.g. HTML5, JavaScript, CSS or Cordova
- Experience in Microsoft Visual Studio development platforms/TFS/tools
- Experience in open-source development platforms/tools, e.g. Eclipse
- Effectively communicate and operate within a cross-functional work environment (Mechanical Engineering, Systems Engineering, Firmware Development, Software Development, Test Development, Manufacturing)
- Experience leading a software development team

Physical Job Requirements
The above statements are intended to describe the general nature and level of work being performed by employees assigned to this position, but they are not an exhaustive list of all the required responsibilities and skills of this position.

Benefits & Compensation
Medtronic offers a competitive salary and flexible benefits package. A commitment to our employees lives at the core of our values. We recognize their contributions. They share in the success they help to create. We offer a wide range of benefits, resources, and competitive compensation plans designed to support you at every career and life stage.

About Medtronic
We lead global healthcare technology and boldly attack the most challenging health problems facing humanity by searching out and finding solutions. Our Mission — to alleviate pain, restore health, and extend life — unites a global team of 95,000+ passionate people. We are engineers at heart — putting ambitious ideas to work to generate real solutions for real people. From the R&D lab, to the factory floor, to the conference room, every one of us experiments, creates, builds, improves and solves. We have the talent, diverse perspectives, and guts to engineer the extraordinary.
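
The "control-system state machine for processing real-time data" requirement can be pictured with a small sketch. Everything below (states, events, telemetry values) is hypothetical and greatly simplified relative to real Class III device software:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    STREAMING = auto()
    FAULT = auto()

class TelemetryProcessor:
    """Toy state machine driven by events from a data-acquisition layer."""

    def __init__(self):
        self.state = State.IDLE
        self.samples = []

    def on_event(self, event: str, payload=None) -> State:
        if self.state is State.IDLE and event == "start":
            self.state = State.STREAMING
        elif self.state is State.STREAMING and event == "sample":
            self.samples.append(payload)          # real-time data path
        elif self.state is State.STREAMING and event == "crc_error":
            self.state = State.FAULT              # stop processing on bad data
        elif event == "reset":
            self.state, self.samples = State.IDLE, []
        return self.state

proc = TelemetryProcessor()
for ev, data in [("start", None), ("sample", 72), ("sample", 74), ("crc_error", None)]:
    proc.on_event(ev, data)
print(proc.state, proc.samples)   # State.FAULT [72, 74]
```

Production code would add explicit transition tables, timing constraints, and the synchronization across inputs the posting mentions; the sketch only shows the event-driven shape of the idea.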

Posted 1 month ago

Apply

5.0 - 9.0 years

9 - 10 Lacs

Hyderabad

On-site

About the Role: Grade Level (for internal use): 10

The Role: Senior Scrum Master

The Team: The team is focused on agile product development, offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution.

The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances overall team performance.

What's in it for you:
- Opportunity to lead and drive Agile transformation within a leading global organization.
- Engage with a dynamic team committed to delivering high-quality solutions.
- Access to professional development and growth opportunities within S&P Global.
- Work in a collaborative and innovative environment that values continuous improvement.

Responsibilities and Impact:
- Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews.
- Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery.
- Manage scope changes and risks, and escalate issues as needed, coordinating testing efforts and assisting scrum teams with technical transitions.
- Support the team in defining and achieving sprint goals and objectives.
- Foster a culture of collaboration and transparency within the team and across stakeholders.
- Encourage and support the development of team members, mentoring them in Agile best practices.
- Conduct data analysis and create and interpret metrics for team performance tracking and improvement.
- Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs.
- Collaborate with stakeholders to help translate business requirements into technical specifications.
- Ensure adherence to Agile best practices and participate in Scrum events.
- Lead initiatives to improve team efficiency and effectiveness in project delivery.

What We're Looking For:

Basic Required Qualifications:
- Bachelor's degree in a relevant field or equivalent work experience.
- Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team.
- Strong understanding of Agile methodologies, particularly Scrum and Kanban.
- Excellent communication and interpersonal skills.
- Proficiency in business analysis: experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions.
- Requirement gathering expertise: ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance.
- Basic understanding of SQL queries: ability to comprehend and analyze existing SQL queries to identify areas for performance improvement.
- Fundamental understanding of database structure: awareness of database concepts including normalization, indexing, and schema design to assess query performance.

Additional Preferred Qualifications:
- Certified Scrum Master (CSM) or similar Agile certification.
- Experience with Agile tools such as Azure DevOps, JIRA, or Trello.
- Proven ability to lead and influence teams in a dynamic environment.
- Familiarity with the software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud.
- Experience in project management and stakeholder engagement.
- Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights.

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316176
Posted On: 2025-06-25
Location: Hyderabad, Telangana, India

Posted 1 month ago

Apply

4.0 - 6.0 years

2 - 6 Lacs

Hyderabad

On-site

About NationsBenefits: At NationsBenefits, we are leading the transformation of the insurance industry by developing innovative benefits management solutions. We focus on modernizing complex back-office systems to create scalable, secure, and high-performing platforms that streamline operations for our clients. As part of our strategic growth, we are focused on platform modernization — transitioning legacy systems to modern, cloud-native architectures that support the scalability, reliability, and high performance of core back-office functions in the insurance domain.

We are seeking an experienced PHP Developer to design, develop, and maintain high-performance web applications. This role involves collaborating with cross-functional teams, optimizing application performance, and ensuring secure and scalable solutions. If you have a strong foundation in PHP development, frameworks, database management, and cloud technologies, we invite you to apply and contribute to cutting-edge projects.

Key Responsibilities:
- Core PHP & Frameworks: Strong expertise in Core PHP and PHP web frameworks (preferably Symfony, Laravel, or CodeIgniter).
- Object-Oriented Programming (OOP): Deep understanding of OOP principles and MVC design patterns in PHP.
- Third-Party Integrations: Experience with third-party API integrations, authentication, and authorization mechanisms.
- Database Management: Strong proficiency in MySQL, knowledge of database normalization and ORM, and experience working with SQL/NoSQL databases.
- Web Development & Front-End: Familiarity with JavaScript, jQuery, VueJS, ReactJS, HTML5, CSS3, and front-end technologies.
- Security & Compliance: Knowledge of security best practices and compliance standards like HIPAA and GDPR.
- Application Design & Scalability: Understanding of scalable application architecture and secure authentication between systems.
- Cloud & DevOps: Hands-on experience with AWS cloud services, Docker containers, CI/CD pipelines, and automation scripts.
- Testing & Debugging: Proficiency in Test-Driven Development (TDD) and strong debugging skills.
- Version Control & Collaboration: Proficient with Git and working in a collaborative Agile/Scrum environment.

Requirements:
- Education & Experience: Bachelor's degree in Computer Science, Information Technology, or a related field, with proven PHP development experience of 4-6 years.
- PHP Frameworks: Strong expertise in Symfony, Laravel, or CodeIgniter.
- Front-End Development: Familiarity with HTML, CSS, JavaScript, jQuery.
- Database & API Management: Experience with MySQL, PostgreSQL, RESTful APIs, and web services.
- Version Control & CI/CD: Proficient in Git, CI/CD pipelines, and automation using shell scripts.
- Team Collaboration & Communication: Ability to work collaboratively, solve complex problems, and pay attention to detail.

Preferred Qualifications:
- Agile & Scrum: Experience working in Agile/Scrum environments.
- Multi-Tech Expertise: Knowledge of additional programming languages (e.g., Python, JavaScript frameworks).
- Cloud & DevOps: Familiarity with AWS, Google Cloud, Docker, and Kubernetes.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Telangana

On-site

- 8+ years of experience in data engineering or a related field.
- Strong expertise in Snowflake, including schema design, performance tuning, and security.
- Proficiency in Python for data manipulation and automation.
- Solid understanding of data modeling concepts (star/snowflake schema, normalization, etc.; see the sketch below).
- Experience with DBT for data transformation and documentation.
- Hands-on experience with ETL/ELT tools and orchestration frameworks (e.g., Airflow, Prefect).
- Strong SQL skills and experience with large-scale data sets.
- Familiarity with cloud platforms (AWS, Azure, or GCP) and data services.
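
As a quick illustration of the dimensional-modeling item above, here is a toy star schema and the kind of fact-to-dimension query it supports. SQLite stands in for Snowflake so the sketch runs with the standard library, and all table and column names are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript(
    """
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT, month TEXT);
    -- The fact table holds measures plus foreign keys into the dimensions.
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        quantity   INTEGER,
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gadget', 'Hardware');
    INSERT INTO dim_date    VALUES (10, '2025-06-01', '2025-06'), (11, '2025-06-02', '2025-06');
    INSERT INTO fact_sales  VALUES (100, 1, 10, 3, 29.97), (101, 2, 11, 1, 19.99);
    """
)

# A typical star-schema query: join the fact to its dimensions and aggregate.
rows = con.execute(
    """
    SELECT p.category, d.month, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id = f.date_id
    GROUP BY p.category, d.month
    """
).fetchall()
print(rows)   # [('Hardware', '2025-06', 49.96)]
```

In a DBT project the dimension and fact tables would typically be materialized as models built from staging tables rather than created by hand as above.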

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad

On-site

Department: Information Technology
Job posted on: Jun 26, 2025
Employee Type: Permanent
Experience range: 5 - 10 years
Job Location: Hyderabad
Role Title: Oracle DBA Administrator

Role Purpose
The purpose of the Business Development role is to identify, create, and nurture growth opportunities for the organization by building strategic relationships, expanding market presence, and driving revenue generation. The role involves proactively identifying new business prospects, developing tailored solutions to meet client needs, and working collaboratively across teams to close deals and foster long-term partnerships. Business Development professionals act as the bridge between market opportunities and the company's strategic goals, ensuring sustained business growth, competitive advantage, and customer success.

Key Accountability Areas

Database Administration:
- Good knowledge of Oracle 11g, 12c and 19c databases.
- Good knowledge of Structured Query Language (SQL).
- Comprehensive knowledge and hands-on experience in managing Oracle and MySQL databases.
- Skill in optimizing database queries for better performance and understanding the importance of indexing, normalization, and denormalization.
- Minimize database downtime and manage parameters to provide fast query responses.
- Monitor databases and related systems to ensure optimized performance.
- Monitor database performance, implement changes, and apply new patches and versions when required.
- Exposure to Middleware (Oracle Forms and Reports) applications would be a significant plus.

System Monitoring and Maintenance: Perform regular system monitoring; verify the integrity and availability of the database, server resources, systems, and key processes; and review system and application logs.
Patch Management: Apply DB and OS patches and upgrades regularly and upgrade administrative tools and utilities. Configure and add new services as necessary.
Troubleshooting and Support: Provide technical support and troubleshooting for server-related issues, ensuring minimal downtime and disruption.
Backup and Recovery: Manage backup and recovery solutions for servers to ensure data integrity and availability.
Documentation and Reporting: Maintain comprehensive documentation of systems, configurations, procedures, and changes. Provide regular reports on system performance and incidents.

Reports to: Lead - DBA
No. of Reportees: Individual Contributor

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.
Work Experience: Minimum of 2+ years of experience in database administration; proven expertise in managing complex database environments; experience with Linux and Windows server OS.

Technical/Functional Competencies
- Proficiency in Linux server operating systems and technologies (RHEL, CentOS, Oracle Linux).
- Proficiency in Windows Server operating systems and technologies (Windows Server 2016, 2019, 2022).
- Exposure to Oracle and AWS cloud platforms.

Behavioral Competencies
- Excellent problem-solving and troubleshooting skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Attention to detail and strong organizational skills.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

Remote

Company Description
Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most.

The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs.

A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role.

Job Description

Key Responsibilities
- Design, build, and maintain scalable and secure relational and cloud-based database systems.
- Migrate data from spreadsheets or third-party sources into databases (PostgreSQL, MySQL, BigQuery).
- Create and maintain automated workflows and scripts for reliable, consistent data ingestion (see the sketch below).
- Optimize query performance and indexing to improve data retrieval efficiency.
- Implement access controls, encryption, and data security best practices to ensure compliance.
- Monitor database health and troubleshoot issues proactively using appropriate tools.
- Collaborate with full-stack developers and data researchers to align data architecture with application needs.
- Uphold data quality through validation rules, constraints, and referential integrity checks.
- Keep up-to-date with emerging technologies and propose improvements to data workflows.
- Leverage tools like Python (Pandas, SQLAlchemy, PyDrive) and version control (Git).
- Support Agile development practices and CI/CD pipelines where applicable.

Required Skills And Experience
- Strong SQL skills and understanding of database design principles (normalization, indexing, relational integrity).
- Experience with relational databases such as PostgreSQL or MySQL.
- Working knowledge of Python, including data manipulation and scripting (e.g., using Pandas, SQLAlchemy).
- Experience with data migration and ETL processes, including integrating data from spreadsheets or external sources.
- Understanding of data security best practices, including access control, encryption, and compliance.
- Ability to write and maintain import workflows and scripts to automate data ingestion and transformation.
- Experience with cloud-based databases, such as Google BigQuery or AWS RDS.
- Familiarity with cloud services (e.g., AWS Lambda, GCP Dataflow) and serverless data processing.
- Exposure to data warehousing tools like Snowflake or Redshift.
- Experience using monitoring tools such as Prometheus, Grafana, or the ELK Stack.
- Good analytical and problem-solving skills, with strong attention to detail.
- Team collaboration skills, especially with developers and analysts, and ability to work independently.
- Proficiency with version control systems (e.g., Git).
- Strong communication skills — written and verbal.

Preferred / Nice-to-Have Skills
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience working with APIs for data ingestion and third-party system integration.
- Familiarity with CI/CD pipelines (e.g., GitHub Actions, Jenkins).
- Python experience using modules such as gspread, PyDrive, PySpark, or object-oriented design patterns.
- Experience in Agile/Scrum teams or working with product development cycles.
- Experience using Tableau and Tableau Prep for data visualization and transformation.

Why Join Us
- Monthly long weekends — every third Friday off
- Wellness reimbursement to support your health and balance
- Paid parental leave
- Remote-first with flexibility and trust
- Work with a world-class data and marketing team inside a globally recognized brand

Qualifications
5+ years of experience in database engineering.

Additional Information
Perks:
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health well-being
- Monthly Office Commutation Reimbursement Program
- Paid paternity and maternity leaves
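
The import-workflow responsibilities above could be sketched roughly as follows; the file name, table name, and the SQLite URL are placeholders rather than Forbes Advisor's actual pipeline:

```python
# Minimal sketch: load a spreadsheet export with pandas, clean it, and bulk-insert
# it into a relational table via SQLAlchemy.

import pandas as pd
from sqlalchemy import create_engine

# SQLite keeps the example self-contained; swap the URL for Postgres/MySQL/BigQuery in practice.
engine = create_engine("sqlite:///advisor_data.db")

def import_spreadsheet(path: str, table: str) -> int:
    df = pd.read_csv(path)

    # Basic cleaning before load: normalize headers, drop empty rows and duplicates.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(how="all").drop_duplicates()

    # Simple append for the demo; a real pipeline would upsert and validate constraints.
    df.to_sql(table, con=engine, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    rows = import_spreadsheet("rates_export.csv", "card_rates")  # hypothetical inputs
    print(f"Loaded {rows} rows into card_rates")
```

Scheduling this under Airflow or a similar orchestrator, and adding validation rules before the load, covers the data-quality and automation points in the same list.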

Posted 1 month ago

Apply

0 years

4 - 9 Lacs

Noida

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant – Java & GCP Developer. In this role, you will be responsible for developing Microsoft Access databases, including tables, queries, forms and reports, using standard IT processes, with data normalization and referential integrity.

Responsibilities
- Experience with Spring Boot
- Must have GCP experience
- Experience with microservices development
- Extensive experience working with Java APIs with Oracle is critical
- Extensive experience in Java 11 SE
- Experience with unit testing frameworks JUnit or Mockito
- Experience with Maven/Gradle
- Experience in API design, troubleshooting, and tuning for performance
- Experience designing and troubleshooting Java API services and microservices
- Professional, precise communication skills

Qualifications we seek in you!
Minimum Qualifications
- BE/B.Tech/M.Tech/MCA

Preferred Qualifications
- Experience with Oracle 11g or 12c PL/SQL is preferred
- Experience in health care or pharmacy-related industries is preferred
- Familiarity with Toad and/or SQL Developer tools
- Experience working with Angular and the Spring Boot framework as well
- Experience with Kubernetes, Azure cloud

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant
Primary Location: India-Noida
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 25, 2025, 12:03:20 PM
Unposting Date: Ongoing
Master Skills List: Consulting
Job Category: Full Time

Posted 1 month ago

Apply

0 years

0 Lacs

Greater Chennai Area

On-site

Job Title: Snowflake Data Engineer
Location: Chennai
Job Type: Full Time

Job Summary: We are looking for a skilled and detail-oriented Snowflake Data Engineer to join our data engineering team. The ideal candidate should have hands-on experience with Snowflake, DBT, SQL, and any one of the cloud platforms (AWS, Azure, or GCP). Experience or exposure to Python for data transformation or scripting is a plus.

Required Skills:
- Strong experience with Snowflake data warehousing architecture and features.
- Hands-on expertise in DBT (Data Build Tool) for transformation and modelling.
- Proficiency in SQL – complex joins, window functions, performance tuning (see the sketch below).
- Experience in at least one major cloud platform: AWS, Azure, or GCP.
- Knowledge of data modelling (dimensional/star schema, normalization, etc.).
- Familiarity with CI/CD pipelines for data deployments.
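
For the window-function skill listed above, a small self-contained example, run against SQLite (3.25+) rather than Snowflake and using made-up table names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript(
    """
    CREATE TABLE orders (region TEXT, order_day TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('South', '2025-06-01', 100.0),
        ('South', '2025-06-02', 150.0),
        ('North', '2025-06-01', 200.0),
        ('North', '2025-06-02',  50.0);
    """
)

# Running total per region: a window function partitioned by region, ordered by day.
query = """
    SELECT region,
           order_day,
           amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY order_day) AS running_total
    FROM orders
    ORDER BY region, order_day
"""
for row in con.execute(query):
    print(row)
# ('North', '2025-06-01', 200.0, 200.0)
# ('North', '2025-06-02', 50.0, 250.0)
# ('South', '2025-06-01', 100.0, 100.0)
# ('South', '2025-06-02', 150.0, 250.0)
```

The same OVER (PARTITION BY ... ORDER BY ...) pattern carries over to Snowflake SQL and to DBT models unchanged.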

Posted 1 month ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description
The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice Analyst

Roles and responsibilities:
- Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources like the internet, specific websites, databases, etc.
- Data quality checks and correction
- Data profiling and reporting (basic)
- Email communication with the client on request acknowledgment, project status and responses to queries
- Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective
- Provide technical consulting to the customer category managers around industry best practices for product data enhancement

Technical and Functional Skills:
- Bachelor's degree in Engineering (Electrical, Mechanical or Electronics stream)
- Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications
- Intermediate knowledge of MS Office/Internet

Posted 1 month ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

JOB_POSTING-3-71879-1 Job Description Role Title : AVP, Enterprise Logging & Observability (L11) Company Overview Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry’s most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India’s Best Companies to Work for by Great Place to Work. We were among the Top 50 India’s Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles. Organizational Overview Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is, to ingests massive datasets from disparate sources and employs advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services and infrastructure. There are many applications that are forwarding data to the Splunk logging solution. Splunk team including Engineering, Development, Operations, Onboarding, Monitoring maintain Splunk and provide solutions to teams across Synchrony. Role Summary/Purpose The role AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization’s centralized logging and observability platform. This role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. This role lead platform development using Agile methodology, manage stakeholder priorities, ensure logging standards across applications and infrastructure, and support security initiatives. This position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, Governance audit, risk teams and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence. Key Responsibilities Splunk Development & Platform Management Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness. Splunk ITSI Implementation & Management - Develop and configure ITSI services, entities, and correlation searches. Implement notable events aggregation policies and automate response actions. 
Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring. Security & Compliance Enablement Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI). Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability. Stakeholder Engagement Act as a strategic advisor and point of contact for business units, application, infrastructure, security stakeholders and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization. Process & Governance Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and selfservice ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices. Required Skills/Knowledge Bachelor's degree with Minimum of 6+ years of experience in Technology ,or in lieu of a degree 8+ years of Experience in Technology Minimum of 3+ years of experience in leading development team or equivalent role in observability, logging, or security platforms. Splunk Subject Matter Expert (SME) Strong hands-on understanding of Splunk architecture, pipelines, dashboards, and alerting, data ingestion, search optimization, and enterprise-scale operations. Experience supporting security use cases, encryption visibility, secrets management, and compliance logging. Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement & Process & Governance Experience with Splunk Premium Apps - ITSI and Enterprise Security (ES) minimally Experience with Data Streaming Platforms & tools like Cribl, Splunk Edge Processor. Proven ability to work in Agile environments using tools such as JIRA or JIRA Align. Strong communication, leadership, and stakeholder management skills. Familiarity with security, risk, and compliance standards relevant to BFSI. Proven experience leading product development teams and managing cross-functional initiatives using Agile methods. Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud. Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking. Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm Develop scripts (python, JavaScript, etc.) as needed in support of data collection or integration Develop new applications leveraging Splunk’s analytic and Machine Learning tools to maximize performance, availability and security improving business insight and operations. Support senior engineers in analyzing system issues and performing root cause analysis (RCA). 
Desired Skills/Knowledge Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g., AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage. Awareness of data classification, retention, and masking/anonymization strategies. Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty). Experience with version control tools – Git, Bitbucket. Eligibility Criteria Bachelor's degree with a minimum of 6+ years of experience in Technology, or in lieu of a degree, 8+ years of experience in Technology. Minimum of 3+ years of experience leading a development team, or an equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus. WORK TIMINGS: 01:00 PM to 10:00 PM IST This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time – 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details. For Internal Applicants Understand the criteria or mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). Must not be on any corrective action plan (First Formal/Final Formal, PIP). Only L09+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level/Grade: 11 Job Family Group: Information Technology
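The listing above asks for ML-driven anomaly detection on log and metric data (for example via Splunk MLTK). As a minimal sketch under stated assumptions (hourly log-volume counts have already been exported from the logging platform to a CSV; the file and column names are hypothetical), the Python snippet below flags unusual hours with scikit-learn's IsolationForest rather than MLTK itself.

```python
# Minimal sketch: flag anomalous hourly log volumes exported from a logging platform.
# Assumes a CSV with columns "hour" and "event_count" (hypothetical names, not from the posting).
import pandas as pd
from sklearn.ensemble import IsolationForest

df = pd.read_csv("hourly_log_volume.csv", parse_dates=["hour"])  # hypothetical export

# Simple features: raw count plus hour-of-day to capture daily seasonality.
features = pd.DataFrame({
    "event_count": df["event_count"],
    "hour_of_day": df["hour"].dt.hour,
})

# contamination is the assumed fraction of anomalous hours; tune for your data.
model = IsolationForest(contamination=0.01, random_state=42)
df["is_anomaly"] = model.fit_predict(features) == -1

print(df.loc[df["is_anomaly"], ["hour", "event_count"]])
```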

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Chennai Area

On-site

Job Overview Plan A Technologies is looking for an MS SQL Server DB developer. This is a fast-paced job with room for significant career growth. Please note: you must have at least 5 years of experience as an MS SQL Server Developer or Database Developer to be considered for this role. JOB RESPONSIBILITY Develop, maintain, and optimize database solutions using SQL Server. Write efficient T-SQL queries, stored procedures, triggers, and functions. Perform database schema design, normalization, and optimization. Collaborate with developers, analysts, and stakeholders to understand database requirements. Optimize database performance through query optimization and indexing. Troubleshoot and resolve database issues such as performance bottlenecks and data corruption. Participate in code reviews, testing, and deployment activities. Stay updated on emerging database technologies and trends. Experience 5-7 years of experience as an MS SQL Server Developer or Database Developer. Proficiency in T-SQL and experience with SQL Server versions (2012/2014/2016/2019). Strong understanding of database concepts including normalization, indexing, and transactions. Experience with database administration tasks such as backup, recovery, and security. Familiarity with ETL tools for data integration (e.g., SSIS, Azure Data Factory). Knowledge of SSRS is an advantage. Excellent problem-solving skills and attention to detail. Excellent communication skills: must have at least Upper-Intermediate-level English (both verbal and written) Advanced problem-solving abilities, research, and learning skills Ability to work with engineers in multiple countries Must have an organized and analytical working style, and the ability to plan your own work Initiative and drive to do great things About The Company/Benefits Plan A Technologies is an American software development and technology advisory firm that brings top-tier engineering talent to clients around the world. Our software engineers tackle custom product development projects, staff augmentation, major integrations and upgrades, and much more. The team is far more hands-on than the giant outsourcing shops, but still big enough to handle major enterprise clients. Read more about us here: www.PlanAtechnologies.com. Location: Chennai, India – Hybrid work schedule: you will be required to work from our Chennai office for a minimum of 2 weeks per month. Great colleagues and an upbeat work environment: You'll join an excellent team of supportive engineers and project managers who work hard but don't ever compete with each other. Benefits: Vacation, Brand New Laptop, and More: You'll get a generous vacation schedule and other goodies. If this sounds like you, we'd love to hear from you!
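The responsibilities above revolve around T-SQL stored procedures and parameterized queries. A minimal sketch of exercising both from Python via pyodbc is shown below; the connection string, table, and procedure name (dbo.usp_GetOrdersByCustomer) are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: run a parameterized T-SQL query and a stored procedure via pyodbc.
# Connection details and object names (dbo.usp_GetOrdersByCustomer) are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=Sales;UID=app_user;PWD=secret"
)
cursor = conn.cursor()

# Parameterized query: avoids SQL injection and lets SQL Server reuse the plan.
cursor.execute(
    "SELECT TOP (10) OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = ?", 42
)
for row in cursor.fetchall():
    print(row.OrderID, row.OrderDate)

# Calling a stored procedure with a parameter (assumes it returns a result set).
cursor.execute("EXEC dbo.usp_GetOrdersByCustomer @CustomerID = ?", 42)
print(cursor.fetchall())

conn.close()
```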

Posted 1 month ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site

Project Role: Program/Project Management Representative Project Role Description: Deliver business and technology outcomes for assigned program, project, or contracted service. Leverage standard tools, methodologies and processes to deliver, monitor, and control service level agreements. Must have skills: Laboratory Information and Execution Systems Good to have skills: Life Sciences Minimum 3 year(s) of experience is required Educational Qualification: 15 years full time education Summary: LabVantage – Design, develop, and maintain software applications using a Laboratory Information Management System (LIMS). Collaborate with cross-functional teams to ensure seamless integration with other IT components. Conduct rigorous system testing and troubleshooting to optimize the performance of software applications. Provide expert technical guidance and support to project teams throughout the implementation lifecycle. Ensure compliance with software development standards and best practices. Roles & Responsibilities: - As a LabVantage application developer, your day-to-day activities will revolve around leveraging your advanced proficiency in Laboratory Information Management System (LIMS) to develop and maintain software applications. - You'll be responsible for designing, coding, testing, and debugging software applications. - You'll be entrusted with the task of ensuring seamless integration with other IT components, thus playing a significant role in contributing to the organization's overall success. - You must have advanced proficiency in Laboratory Information Management System (LIMS). - Having intermediate proficiency in Configuration & Release Management and advanced proficiency in Design & Build Enablement will be advantageous. - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Collaborate with stakeholders to define project objectives and scope. - Develop and maintain project plans, including timelines, budgets, and resource allocation. - Monitor project progress and ensure adherence to timelines and deliverables. - Identify and mitigate project risks and issues. Professional & Technical Skills: - Must-have skills: Proficiency in Laboratory Information and Execution Systems. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Laboratory Information and Execution Systems. - This position is based at our Bengaluru office. - A 15 years full-time education is required.
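The skills list above mentions data munging: cleaning, transformation, and normalization. Purely as an illustrative sketch, and not tied to LabVantage or any LIMS API, the pandas snippet below shows the kind of de-duplication, type coercion, and min-max normalization meant here; the column names are invented.

```python
# Minimal sketch of data cleaning and min-max normalization with pandas.
# The sample_id/result_value column names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "sample_id": ["S1", "S2", "S2", "S3", "S4"],
    "result_value": ["12.5", "13.1", "13.1", None, "15.0"],
})

df = df.drop_duplicates()                                # remove exact duplicate rows
df["result_value"] = pd.to_numeric(df["result_value"])   # coerce strings to numbers
df = df.dropna(subset=["result_value"])                  # drop rows with missing results

# Min-max normalization to the [0, 1] range.
lo, hi = df["result_value"].min(), df["result_value"].max()
df["result_norm"] = (df["result_value"] - lo) / (hi - lo)
print(df)
```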

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Details Category: Data Science Location: Bangalore Experience Level: 4-8 Years Position Description We are looking for a Data Engineer who will play a pivotal role in transforming raw data into actionable intelligence through sophisticated data pipelines and machine learning deployment frameworks. The candidate will collaborate across functions to understand business objectives, engineer data solutions, and ensure robust AI/ML model deployment and monitoring. This role is ideal for someone passionate about data science, MLOps, and building scalable data ecosystems in cloud environments. Key Responsibilities Data Engineering & Data Science: Preprocess structured and unstructured data to prepare for AI/ML model development. Apply strong skills in feature engineering, data augmentation, and normalization techniques. Manage and manipulate data using SQL, NoSQL, and cloud-based data storage solutions such as Azure Data Lake. Design and implement efficient ETL pipelines, data wrangling, and data transformation strategies. Model Deployment & MLOps Deploy ML models into production using Azure Machine Learning (Azure ML) and Kubernetes. Implement MLOps best practices, including CI/CD pipelines, model versioning, and monitoring frameworks. Design mechanisms for model performance monitoring, alerting, and retraining. Utilize containerization technologies (Docker/Kubernetes) to support deployment and scalability. Business & Analytics Insights Work closely with stakeholders to understand business KPIs and decision-making frameworks. Analyze large datasets to identify trends, patterns, and actionable insights that inform business strategies. Develop data visualizations using tools like Power BI, Tableau, and Matplotlib to communicate insights effectively. Conduct A/B testing and evaluate model performance using metrics such as precision, recall, F1-score, MSE, RMSE, and model validation techniques. Desired Profile Proven experience in data engineering, AI/ML data preprocessing, and model deployment. Strong expertise in working with both structured and unstructured datasets. Hands-on experience with SQL, NoSQL databases, and cloud data platforms (e.g., Azure Data Lake). Deep understanding of MLOps practices, containerization (Docker/Kubernetes), and production-level model deployment. Technical Skills Proficient in ETL pipeline creation, data wrangling, and transformation methods. Strong experience with Azure ML, Kubernetes, and other cloud-based deployment technologies. Excellent knowledge of data visualization tools (Power BI, Tableau, Matplotlib). Expertise in model evaluation and testing techniques, including A/B testing and performance metrics. Soft Skills Strong analytical mindset with the ability to solve complex data-related problems. Ability to collaborate with cross-functional teams to understand business needs and provide actionable insights. Clear communication skills to convey technical details to non-technical stakeholders. If you are passionate about working in a collaborative and challenging environment, apply now!
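Since the role evaluates models with precision, recall, F1-score, MSE, and RMSE, here is a minimal sketch of computing those metrics with scikit-learn on toy labels and predictions (the numbers are invented for illustration).

```python
# Minimal sketch: the evaluation metrics named in the posting, computed on toy data.
from sklearn.metrics import precision_score, recall_score, f1_score, mean_squared_error

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))

# Regression-style evaluation: MSE and RMSE on hypothetical values.
y_actual = [3.0, 2.5, 4.1]
y_hat = [2.8, 2.7, 3.9]
mse = mean_squared_error(y_actual, y_hat)
print("mse:", mse, "rmse:", mse ** 0.5)
```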

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title – Data Engineer (SQL Server, Python, AWS, ETL) Preferred Location: Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Role Description Will work with high-performance software engineering and Analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software development, including coding, troubleshooting, deployment, and ongoing maintenance. Role Responsibilities Design, develop, and implement SQL Server databases based on business requirements and best practices. Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access. Ensure data integrity, security, and performance through proper database design and normalization techniques. Analyze query execution plans and performance metrics to identify and address performance bottlenecks. Implement indexing strategies and database optimizations to improve query performance. Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases (a brief illustrative sketch follows this listing). Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference. Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage. Minimum Requirements BTech degree in Computer Science or related discipline, MTech degree preferred. Assertive communication, strong analytical, problem solving, debugging, and leadership skills. Experience with source control tools like Bitbucket and/or Git. Good hands-on experience diagnosing performance bottlenecks, wait stats, SQL query monitoring, review, and optimization strategies. Create normalized and highly scalable logical and physical database designs and switch between different database technologies like Oracle, SQL Server, and Elastic databases. 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization. 5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions. Excellent understanding of Snowflake and other data warehouses. Experience in designing and hands-on development of cloud-based analytics solutions. Understanding of AWS storage services and AWS Cloud Infrastructure offerings. Designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice
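As referenced in the listing above, a brief illustrative ETL sketch: it extracts JSON over HTTP with requests, applies a light pandas transform, and loads the result into a SQL Server staging table with SQLAlchemy. The endpoint URL, credentials, column names, and table name are all hypothetical.

```python
# Minimal ETL sketch: extract JSON over HTTP, transform with pandas, load into SQL Server.
# The endpoint URL, credentials, column names, and target table are hypothetical.
import pandas as pd
import requests
from sqlalchemy import create_engine

# Extract
resp = requests.get("https://api.example.com/v1/readings", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())

# Transform: rename, type, and de-duplicate before loading.
df = (
    df.rename(columns={"ts": "reading_time", "val": "reading_value"})
      .astype({"reading_value": "float64"})
      .drop_duplicates(subset=["reading_time"])
)

# Load: append into a staging table in SQL Server via pyodbc-backed SQLAlchemy engine.
engine = create_engine(
    "mssql+pyodbc://app_user:secret@myserver/Analytics?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("stg_readings", engine, if_exists="append", index=False)
```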

Posted 1 month ago

Apply

0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job Description The ideal candidate must possess strong communication skills, with an ability to listen and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and willingness to take up responsibility beyond the assigned work area is a plus. Apprentice Analyst – Roles and responsibilities: Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources like the internet, specific websites, databases, etc. Data quality checks and correction Data profiling and reporting (basic) Email communication with the client on request acknowledgment, project status, and responses to queries Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective Provide technical consulting to the customer category managers around the industry best practices of product data enhancement Technical And Functional Skills Bachelor's Degree in Engineering from the Electrical, Mechanical, or Electronics stream Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications Intermediate knowledge of MS Office/Internet.
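Much of the work described above is standardizing free-text technical specifications. Purely as an illustrative sketch (the spec format, unit list, and regex are assumptions), the snippet below normalizes power ratings quoted in mixed units to watts.

```python
# Minimal sketch: normalize free-text power specs ("1.5 kW", "750 W", "2 HP") to watts.
# The accepted unit list and the regex are assumptions for illustration only.
import re

UNIT_TO_WATTS = {"w": 1.0, "kw": 1000.0, "hp": 745.7}

def normalize_power(spec):
    """Return the power in watts, or None if the spec cannot be parsed."""
    match = re.search(r"([\d.]+)\s*(kw|hp|w)\b", spec.lower())
    if not match:
        return None
    value, unit = match.groups()
    return float(value) * UNIT_TO_WATTS[unit]

for raw in ["1.5 kW", "750 W", "2 HP", "unknown"]:
    print(raw, "->", normalize_power(raw))
```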

Posted 1 month ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Role: Team Leader - Service Desk Location: Pune/Bangalore Job Summary – Candidates with a minimum of 6 years of Service Desk experience, with a minimum of 2 years in a Front Line Leadership/Management role. We are looking for candidates with domain expertise in End User Support Services, skilled in technical troubleshooting and delivery operations management. Passport (mandatory); Advantage: US business visa (B1). Years of experience needed – 5-8 years Technical Skills - Analytical skills - Effective Business Communication - Coaching skills - Operations Management - SLA Management - MS Office - Operational knowledge of contact center platform and ITSM tool - Performance Management skills - Conflict management skills - Capacity management - Presentation skills - Training need identification - Technical Skills (Client): Technical Service Awareness – Intermediate - Technical Troubleshooting - Account Management/password reset – Advanced - Technical Troubleshooting - OS – Advanced - Technical Troubleshooting - End Devices – Advanced - Ticketing Tool – Advanced - MS Office – Intermediate - Contact center platform operating skills – Intermediate - Contact center platform reports – Intermediate - Networking concepts – Intermediate - Client Process Knowledge – Advanced - DMAIC Methodology – Intermediate - Client Business Awareness – Advanced - Telephone etiquette – Expert - Email etiquette – Expert - Customer service skills – Expert - Knowledge Base Navigation Skills – Advanced - Analytical skills – Intermediate - Operations Management – Advanced - SLA Management – Intermediate - Effective Business Communication – Advanced - Decision Making Skills – Advanced - Measuring Performance/Performance Management Skills – Advanced - Coaching for Success – Advanced - Motivating Others – Advanced - Conflict Management Skills – Advanced - Patience – Advanced - Managing Stress – Advanced - Positive attitude to change – Advanced - Attitude to feedback/willing to learn – Advanced - Relating to Others – Advanced - Influencing Others – Advanced - Team Player – Advanced - Insight into the Customer's Mindset – Advanced - Solution Based Approach – Advanced - Follow Through – Advanced - Personal Credibility – Advanced - Self-Development – Intermediate - Result Focus – Intermediate - Drive to Win – Intermediate - Recognize Efforts – Advanced - Approachability – Advanced - Dealing with Fairness – Expert - Fostering Teamwork – Advanced Management Skills - Supervise and review Service Desk activities. - Review and ensure compliance with standards like PCI, ISO, ISMS, BCMS by facilitating audits by internal and external teams. - Place hiring requests and conduct interviews. - Work with HR and support groups to improve employee retention and satisfaction. - In-person feedback to reporting agents on a daily basis regarding ticket hygiene and operational/procedural hygiene. - Root cause analysis, tracking, and reporting of escalations and SLA misses. - Attend change meetings and analyze potential impact to Service Desk operations. - Performance appraisal and normalization. - Participate in calibration and collaboration meetings with support function leads. - Conduct new hire technical and account-specific training based on the requirements. - Create, maintain, and update the account training plan. - Provide hands-on assistance to team members in case of issues, both through direct intervention and mentoring. - Prepare Score Cards and discuss and share feedback around improvement areas.
- Identify top performers and nominate them for Rewards and Recognition and appreciation. - Monitor ticket ageing reports and drive team members to work on ageing tickets. - FCR analysis: find out controllable resolution errors that could have been resolved at L1. Behavioral Skills - Good communication - Positive energy - Positive attitude - Self-learner Qualification - Any Graduate. Certification - ITIL certified. About Mphasis Mphasis applies next-generation technology to help enterprises transform businesses globally. Customer centricity is foundational to Mphasis and is reflected in Mphasis' Front2Back™ Transformation approach. Front2Back™ uses the exponential power of cloud and cognitive to provide a hyper-personalized (C=X2C2™=1) digital experience to clients and their end customers. Mphasis' Service Transformation approach helps 'shrink the core' through the application of digital technologies across legacy environments within an enterprise, enabling businesses to stay ahead in a changing world. Mphasis' core reference architectures and tools, and speed and innovation with domain expertise and specialization, are key to building strong relationships with marquee clients.

Posted 1 month ago

Apply

7.0 years

0 Lacs

India

Remote

Role Overview: We are looking for a highly skilled and experienced ServiceNow professional (7+ years) to join our freelance technical interview panel. As a Panelist, you'll play a critical role in assessing candidates for ServiceNow Developer, Admin, and Architect roles by conducting deep technical interviews and evaluating hands-on expertise, problem-solving skills, and platform knowledge. This is an excellent opportunity for technically strong freelancers who enjoy sharing their expertise, influencing hiring decisions, and working flexible hours remotely. Key Responsibilities: Conduct live technical interviews and evaluations over video calls (aligned to EST hours) Assess candidates' practical expertise in: Core ServiceNow modules (ITSM, CMDB, Discovery, Incident/Change/Problem) Custom application development & configuration Client/Server-side scripting (JavaScript, Business Rules, UI Policies, Script Includes) Integrations (REST/SOAP APIs, Integration Hub) Flow Designer, Service Portal, ACLs, ATF, and CI/CD practices Review coding tasks and scenario-based architecture questions Provide detailed, structured feedback and recommendations to the hiring team Collaborate on refining technical evaluation criteria if needed Required Skills & Experience (Advanced Technical Expertise): 10+ years of extensive hands-on experience with the ServiceNow platform in enterprise-grade environments Strong command over ServiceNow Core Modules: ITSM, ITOM, CMDB, Asset & Discovery, Incident/Change/Problem/Knowledge Management Proven expertise in custom application development using scoped apps, App Engine Studio, and Now Experience UI Framework Deep proficiency in ServiceNow scripting, including: Server-side: Business Rules, Script Includes, Scheduled Jobs, GlideRecord, GlideAggregate Client-side: UI Policies, Client Scripts, UI Actions, GlideForm/GlideUser APIs Middleware logic for cross-platform communication and custom handlers Experience implementing Access Control Lists (ACLs) with dynamic filters and condition-based restrictions Expert in Service Portal customization using AngularJS widgets, Bootstrap, and custom REST endpoints Proficient in Integration Hub, Custom REST/SOAP APIs, OAuth 2.0 authentication, MID Server integrations, external system integration (e.g., SAP, Azure, Jira, Dynatrace, etc.) Hands-on with Flow Designer, Orchestration, and Event Management Expertise in ServiceNow CMDB, CI Class modeling, reconciliation rules, identification/normalization strategies, and dependency mappings Familiarity with ServiceNow Performance Tuning: Scheduled Jobs optimization, lazy loading, database indexing, client/server execution efficiency Working knowledge of Automated Test Framework (ATF) and integration with CI/CD pipelines (Jenkins, Git, Azure DevOps) Understanding of ServiceNow DevOps, version control, scoped app publishing, and update set migration best practices Knowledge of Security Operations (SecOps) and Governance, Risk & Compliance (GRC) is a plus Experience guiding architectural decisions, governance models, and platform upgrade strategies Prior experience conducting technical interviews, design evaluations, or acting as a technical SME/panelist Excellent communication and feedback documentation skills: able to clearly explain technical rationale and candidate assessments Comfortable working independently and engaging with global stakeholders during USA EST hours (after 8 PM IST)
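Because candidates are assessed on ServiceNow REST integrations, a minimal sketch of querying the platform's standard Table API for incidents with Python requests follows; the instance name and credentials are placeholders, and real instances commonly use OAuth rather than basic auth.

```python
# Minimal sketch: query ServiceNow's Table API for active P1 incidents.
# Instance name and credentials are placeholders; production setups often use OAuth 2.0.
import requests

INSTANCE = "https://dev12345.service-now.com"   # hypothetical instance
AUTH = ("api_user", "api_password")             # basic-auth placeholder

resp = requests.get(
    f"{INSTANCE}/api/now/table/incident",
    params={
        "sysparm_query": "active=true^priority=1",  # encoded query string
        "sysparm_fields": "number,short_description,priority",
        "sysparm_limit": 10,
    },
    auth=AUTH,
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()
for rec in resp.json()["result"]:
    print(rec["number"], rec["priority"], rec["short_description"])
```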

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Description Job summary Amazon.com's Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites. As such, BRP designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com. The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will need to collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions and proactively present new insights in a concise and effective manner. The candidate should be an effective communicator capable of independently driving issues to resolution and communicating insights to non-technical audiences. This is a high-impact role with goals that directly impact the bottom line of the business. Key job responsibilities Analyze terabytes of data to define and deliver on complex analytical deep dives to unlock insights and build scalable solutions through Data Science to ensure the security of Amazon's platform and transactions. Build machine learning and/or statistical models that evaluate transaction legitimacy and track impact over time (a brief illustrative sketch follows this listing). Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping. Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to Product and Tech teams. Develop efficient data querying infrastructure for both offline and online use cases. Collaborate with cross-functional teams from multidisciplinary science, engineering, and business backgrounds to enhance current automation processes. Learn and understand a broad range of Amazon's data resources and know when, how, and which to use and which not to use. Maintain technical documentation and communicate results to diverse audiences with effective writing, visualizations, and presentations. Basic Qualifications 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab). 2+ years of data scientist experience. 3+ years of experience with machine learning/statistical modeling, data analysis tools and techniques, and the parameters that affect their performance. Experience applying theoretical models in an applied environment. Preferred Qualifications Experience in Python, Perl, or another scripting language. Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2998274
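The role builds statistical models that score transaction legitimacy. As a toy sketch only, with synthetic features and labels that are not Amazon's, a logistic-regression risk scorer might look like the following.

```python
# Toy sketch: score transaction risk with logistic regression on synthetic features.
# Feature names, data, and coefficients are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.exponential(50.0, n),   # order_amount
    rng.integers(0, 2, n),      # new_account flag
    rng.poisson(3.0, n),        # orders_last_30d
])
# Synthetic label: larger orders on new accounts are made riskier on purpose.
p = 1 / (1 + np.exp(-(0.01 * X[:, 0] + 1.5 * X[:, 1] - 0.3 * X[:, 2] - 2.0)))
y = rng.binomial(1, p)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
scores = model.predict_proba(X_te)[:, 1]   # probability of a risky transaction
print("AUC:", round(roc_auc_score(y_te, scores), 3))
```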

Posted 1 month ago

Apply

5.0 years

5 - 9 Lacs

Hyderābād

On-site

Hyderabad, Telangana Job ID 30162733 Job Category Digital Technology Job Title – Data Engineer (SQL Server, Python, AWS, ETL) Preferred Location: Hyderabad, India Full time/Part Time - Full Time Build a career with confidence Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do. Role Description: Will work with high-performance software engineering and Analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of software development, including coding, troubleshooting, deployment, and ongoing maintenance. Role Responsibilities: Design, develop, and implement SQL Server databases based on business requirements and best practices. Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access. Ensure data integrity, security, and performance through proper database design and normalization techniques. Analyze query execution plans and performance metrics to identify and address performance bottlenecks. Implement indexing strategies and database optimizations to improve query performance. Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases. Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference. Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage. Minimum Requirements: BTech degree in Computer Science or related discipline, MTech degree preferred. Assertive communication, strong analytical, problem solving, debugging, and leadership skills. Experience with source control tools like Bitbucket and/or Git. Good hands-on experience diagnosing performance bottlenecks, wait stats, SQL query monitoring, review, and optimization strategies. Create normalized and highly scalable logical and physical database designs and switch between different database technologies like Oracle, SQL Server, and Elastic databases. 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization. 5+ years of SQL Server development experience with strong programming experience in writing stored procedures and functions. Excellent understanding of Snowflake and other data warehouses. Experience in designing and hands-on development of cloud-based analytics solutions. Understanding of AWS storage services and AWS Cloud Infrastructure offerings. Designing and building data pipelines using API ingestion and streaming ingestion methods. Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential. Benefits We are committed to offering competitive benefits programs for all of our employees, and enhancing our programs when necessary.
Have peace of mind and body with our health insurance Make yourself a priority with flexible schedules and leave policy Drive forward your career through professional development opportunities Achieve your personal goals with our Employee Assistance Program. Our commitment to you Our greatest assets are the expertise, creativity and passion of our employees. We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.

Posted 1 month ago

Apply

2.0 years

0 Lacs

India

On-site

Job Title: DBMS Trainer Location: Hyderabad, Telangana Experience Required: Minimum 2 years in database development or training Employment Type: Full-Time, Onsite Job Summary: We are seeking a dynamic and experienced DBMS Trainer to join our team in Hyderabad. The ideal candidate will have a strong background in database systems, both relational and NoSQL, and a passion for mentoring and training aspiring database professionals. You will be responsible for delivering engaging, interactive, and industry-relevant training sessions on core database concepts, administration, optimization, and real-world applications. Key Responsibilities: Curriculum Development: Design, develop, and maintain comprehensive training modules covering SQL (MySQL, PostgreSQL, Oracle), NoSQL (MongoDB, Cassandra), database design, normalization, indexing, backup/recovery strategies, and data modeling. Training Delivery: Conduct engaging, in-person classroom and lab sessions on SQL querying, stored procedures, transactions, optimization, security best practices, and cloud DBMS concepts. Hands-On Workshops: Facilitate practical, real-world exercises including schema design, performance tuning, backup/recovery, and managing unstructured data scenarios. Mentorship & Assessment: Evaluate learners through quizzes, assignments, and capstone projects. Provide continuous feedback, interview preparation, and career counseling support. Content Updating: Regularly update course content to reflect industry advancements, including cloud databases, big data integrations, and emerging DBMS technologies. Lab & Tool Management: Set up, manage, and troubleshoot training environments (both on-premises and cloud-based), and work closely with technical teams to ensure seamless training delivery. Required Qualifications: Bachelor's degree in Computer Science, IT, ECE, or a related field. Minimum 2 years of hands-on experience in database development, administration, or technical training roles. Technical Skills: SQL Databases: MySQL, PostgreSQL, Oracle (queries, joins, transactions, stored procedures) NoSQL Databases: MongoDB, Cassandra (document modeling, indexing) Database Design & Administration: ER modeling, normalization, indexing, backup & recovery, security management Performance Tuning: Query optimization, indexing strategies, monitoring and logging tools Data Modeling: Relational and unstructured/NoSQL data structures Basic Cloud DBMS: Familiarity with AWS RDS, Azure SQL, Firebase/Firestore Version Control & Scripting: Git, basic shell/SQL scripts for automation Communication & Mentoring: Strong presentation, troubleshooting, and feedback skills Preferred Extras: Certifications such as Oracle OCA, AWS/Azure database certifications, MongoDB Certified Developer Experience with big data tools (Hive, Spark SQL) or cloud-native data platforms Experience using Learning Management Systems (LMS) and e-learning platforms
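One classroom-style illustration of the indexing topic in this syllabus: using only Python's built-in sqlite3 module, the sketch below shows how the plan reported by EXPLAIN QUERY PLAN changes once an index exists. It is a teaching sketch, not tied to any particular curriculum or engine named above.

```python
# Teaching sketch: show how an index changes the query plan (sqlite3, standard library only).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
cur.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 1000, i * 0.5) for i in range(100_000)],
)

query = "SELECT COUNT(*) FROM orders WHERE customer_id = ?"
# Before: the plan shows a full table scan.
print("before:", cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
# After: the plan uses the new index to search instead of scanning.
print("after: ", cur.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())
conn.commit()
```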

Posted 1 month ago

Apply

2.0 - 3.0 years

15 Lacs

India

Remote

We are seeking a skilled and detail-oriented PostgreSQL Database Developer & Designer to join our team. The ideal candidate will be responsible for designing, developing, optimizing, and maintaining scalable and secure PostgreSQL databases that support our application and business needs. Key Responsibilities: Design and develop efficient and scalable database schemas, tables, views, indexes, and stored procedures Develop and optimize complex SQL queries, functions, and triggers in PostgreSQL Perform data modeling and create ER diagrams to support business logic and performance Work closely with application developers to design and implement data access patterns Monitor database performance and tune queries for high availability and efficiency Maintain data integrity, quality, and security across all environments Develop and manage ETL processes, migrations, and backup strategies Assist in database version control and deployment automation Troubleshoot and resolve database-related issues in development and production Required Skills & Qualifications: Minimum 2–3 years of experience in PostgreSQL database development and design Strong understanding of relational database design principles, normalization, and indexing Proficient in writing complex SQL queries, functions, stored procedures, and performance tuning Experience with data modeling tools (e.g., pgModeler, dbdiagram.io, ER/Studio) Familiarity with database version control (e.g., Liquibase, Flyway) Solid understanding of PostgreSQL internals, query planner, and performance optimization techniques Knowledge of data security, encryption, and compliance standards Strong problem-solving skills and attention to detail Nice to Have (Pluses): Experience with cloud databases (e.g., Amazon RDS for PostgreSQL, Google Cloud SQL, Azure Database for PostgreSQL) Familiarity with NoSQL or hybrid data architectures Exposure to Kafka, RabbitMQ, or other message brokers Experience working in Agile/Scrum teams Knowledge of CI/CD pipelines for database deployments Understanding of data warehousing and analytics/reporting workflows What We Offer: Competitive compensation package Opportunity to work on high-impact systems and large-scale databases Collaborative team environment with growth and learning opportunities Remote-friendly and flexible work schedule Job Type: Full-time Pay: ₹1,500,000.00 per year Benefits: Health insurance Schedule: Day shift Experience: PostgreSQL: 5 years (Required) SQL: 5 years (Required) Work Location: In person Application Deadline: 05/07/2025 Expected Start Date: 01/08/2025
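Given the emphasis on query performance tuning, here is a minimal sketch of inspecting a plan with EXPLAIN ANALYZE from Python via psycopg2; the connection string and table/column names are hypothetical.

```python
# Minimal sketch: run EXPLAIN (ANALYZE, BUFFERS) against PostgreSQL via psycopg2.
# The connection string and the orders/customer_id names are hypothetical.
import psycopg2

conn = psycopg2.connect("dbname=appdb user=app_user password=secret host=localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        "EXPLAIN (ANALYZE, BUFFERS) SELECT * FROM orders WHERE customer_id = %s",
        (42,),
    )
    # Each plan row comes back as a single text column; print it line by line.
    for (line,) in cur.fetchall():
        print(line)
conn.close()
```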

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: We are seeking a skilled Data Engineer with strong experience in Python, Snowflake, and AWS. The ideal candidate will be responsible for building and optimizing scalable data pipelines, integrating diverse data sources, and supporting analytics and business intelligence solutions in a cloud environment. A key focus will include designing and managing AWS Glue Jobs and enabling efficient, serverless ETL workflows. Key Responsibilities: Design and implement robust data pipelines using AWS Glue, Lambda, and Python. Work extensively with Snowflake for data warehousing, modelling, and analytics support. Manage ETL/ELT jobs using AWS Glue and ensure end-to-end data reliability. Migrate data between CRM systems, especially from Snowflake to Salesforce, following defined business rules and ensuring data accuracy. Optimize SQL/SOQL queries, handle large volumes of data, and maintain high levels of performance. Implement data normalization and data quality checks to ensure accurate, consistent, and deduplicated records. Required Skills: Strong programming skills in Python. Hands-on experience with Snowflake Data Warehouse. Proficiency in AWS services: Glue, S3, Lambda, Redshift, CloudWatch. Experience with ETL/ELT pipelines and data integration using AWS Glue Jobs. Proficient in SQL and SOQL for data extraction and transformation. Understanding of data modelling, normalization, and performance optimization. Nice to Have: Familiarity with Salesforce Data Loader, ETL mapping, and metadata-driven migration. Experience with CI/CD tools, DevOps, and version control (e.g., Git). Worked in Agile/Scrum environments.
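For the Glue-based ETL described above, a minimal job-script skeleton is sketched below using standard AWS Glue boilerplate (GlueContext and DynamicFrames); the catalog database, table, dropped column, and S3 path are placeholders, and a real job would add transforms, schema handling, and error handling.

```python
# Minimal AWS Glue job skeleton: read from the Glue Data Catalog, write Parquet to S3.
# Database, table, column, and bucket names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract from the catalog (placeholder database/table).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Light transform: drop an unusable column (placeholder name).
dyf = dyf.drop_fields(["_corrupt_record"])

# Load as Parquet to S3 (placeholder bucket).
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```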

Posted 1 month ago

Apply