5.0 years
0 Lacs
Hyderābād
On-site
Department: Information Technology. Job posted on: Jun 26, 2025. Employee Type: Permanent. Experience Range (Years): 5 - 10 years. Job Location: Hyderabad. Role Title: Oracle DBA Administrator. Role Purpose: The purpose of the Oracle DBA Administrator role is to administer, monitor, and maintain the organization's Oracle and MySQL databases, ensuring optimal performance, availability, and data integrity. Key Accountability Areas Database Administration: Good knowledge of Oracle 11g, 12c, and 19c databases and of Structured Query Language (SQL). Comprehensive, hands-on experience managing Oracle and MySQL databases. Skill in optimizing database queries for better performance and an understanding of the importance of indexing, normalization, and denormalization. Minimize database downtime and tune parameters to provide fast query responses. Monitor databases and related systems to ensure optimized performance; implement changes and apply new patches and versions when required. Exposure to middleware (Oracle Forms and Reports) applications is a significant plus. System Monitoring and Maintenance: Perform regular system monitoring; verify the integrity and availability of the database, server resources, systems, and key processes; and review system and application logs. Patch Management: Apply DB and OS patches and upgrades regularly, upgrade administrative tools and utilities, and configure and add new services as necessary.
Troubleshooting and Support: Provide technical support and troubleshooting for server-related issues, ensuring minimal downtime and disruption. Backup and Recovery: Manage backup and recovery solutions for servers to ensure data integrity and availability. Documentation and Reporting: Maintain comprehensive documentation of systems, configurations, procedures, and changes. Provide regular reports on system performance and incidents. Reports to: Lead DBA. No. of Reportees: Individual Contributor. Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field. Work Experience: Minimum of 2+ years of experience in database administration. Proven expertise in managing complex database environments. Experience with Linux and Windows server operating systems. Technical / Functional Competencies: Proficiency in Linux server operating systems and technologies (RHEL, CentOS, Oracle Linux). Proficiency in Windows Server operating systems and technologies (Windows Server 2016, 2019, 2022). Exposure to Oracle and AWS cloud platforms. Behavioral Competencies: Excellent problem-solving and troubleshooting skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Attention to detail and strong organizational skills.
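The indexing and query-optimization duties described in this posting can be sketched briefly. The example below uses Python's built-in sqlite3 module as a stand-in for Oracle/MySQL, and the table and column names are hypothetical; the point is how an index changes the query plan.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(1000)])

# Without an index, the planner must scan the whole table.
plan_before = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchone()[-1]

# Adding an index lets the planner seek directly to matching rows.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = con.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42").fetchone()[-1]

print(plan_before)  # e.g. "SCAN orders" (full table scan)
print(plan_after)   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same reasoning applies in Oracle via EXPLAIN PLAN and in SQL Server via execution plans; the trade-off is faster reads against extra write and storage cost per index.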
Posted 1 month ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
Remote
Company Description Forbes Advisor is a new initiative for consumers under the Forbes Marketplace umbrella that provides journalist- and expert-written insights, news and reviews on all things personal finance, health, business, and everyday life decisions. We do this by providing consumers with the knowledge and research they need to make informed decisions they can feel confident in, so they can get back to doing the things they care about most. The Data Research Engineering Team is a brand new team with the purpose of managing data from acquisition to presentation, collaborating with other teams while also operating independently. Their responsibilities include acquiring and integrating data, processing and transforming it, managing databases, ensuring data quality, visualizing data, automating processes, working with relevant technologies, and ensuring data governance and compliance. They play a crucial role in enabling data-driven decision-making and meeting the organization's data needs. A typical day in the life of a Database Engineer/Developer will involve designing, developing, and maintaining a robust and secure database infrastructure to efficiently manage company data. They collaborate with cross-functional teams to understand data requirements and migrate data from spreadsheets or other sources to relational databases or cloud-based solutions like Google BigQuery and AWS. They develop import workflows and scripts to automate data import processes, optimize database performance, ensure data integrity, and implement data security measures. Their creativity in problem-solving and continuous learning mindset contribute to improving data engineering processes. Proficiency in SQL, database design principles, and familiarity with Python programming are key qualifications for this role. Job Description Key Responsibilities Design, build, and maintain scalable and secure relational and cloud-based database systems. 
Migrate data from spreadsheets or third-party sources into databases (PostgreSQL, MySQL, BigQuery). Create and maintain automated workflows and scripts for reliable, consistent data ingestion. Optimize query performance and indexing to improve data retrieval efficiency. Implement access controls, encryption, and data security best practices to ensure compliance. Monitor database health and troubleshoot issues proactively using appropriate tools. Collaborate with full-stack developers and data researchers to align data architecture with application needs. Uphold data quality through validation rules, constraints, and referential integrity checks. Keep up-to-date with emerging technologies and propose improvements to data workflows. Leverage tools like Python (Pandas, SQLAlchemy, PyDrive), and version control (Git). Support Agile development practices and CI/CD pipelines where applicable. Required Skills And Experience Strong SQL skills and understanding of database design principles (normalization, indexing, relational integrity). Experience with relational databases such as PostgreSQL or MySQL. Working knowledge of Python, including data manipulation and scripting (e.g., using Pandas, SQLAlchemy). Experience with data migration and ETL processes, including integrating data from spreadsheets or external sources. Understanding of data security best practices, including access control, encryption, and compliance. Ability to write and maintain import workflows and scripts to automate data ingestion and transformation. Experience with cloud-based databases, such as Google BigQuery or AWS RDS. Familiarity with cloud services (e.g., AWS Lambda, GCP Dataflow) and serverless data processing. Exposure to data warehousing tools like Snowflake or Redshift. Experience using monitoring tools such as Prometheus, Grafana, or the ELK Stack. Good analytical and problem-solving skills, with strong attention to detail. 
Team collaboration skills, especially with developers and analysts, and the ability to work independently. Proficiency with version control systems (e.g., Git). Strong written and verbal communication skills. Preferred / Nice-to-Have Skills Bachelor's degree in Computer Science, Information Systems, or a related field. Experience working with APIs for data ingestion and third-party system integration. Familiarity with CI/CD pipelines (e.g., GitHub Actions, Jenkins). Python experience using modules such as gspread, PyDrive, PySpark, or object-oriented design patterns. Experience in Agile/Scrum teams or working with product development cycles. Experience using Tableau and Tableau Prep for data visualization and transformation. Why Join Us Monthly long weekends (every third Friday off). Wellness reimbursement to support your health and balance. Paid parental leave. Remote-first with flexibility and trust. Work with a world-class data and marketing team inside a globally recognized brand. Qualifications 5+ years of experience in database engineering. Additional Information Perks: Day off on the 3rd Friday of every month (one long weekend each month). Monthly Wellness Reimbursement Program to promote health and well-being. Monthly Office Commutation Reimbursement Program. Paid paternity and maternity leaves.
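The spreadsheet-to-database ingestion workflow this role describes can be sketched with the standard library alone; in practice the team would use Pandas/SQLAlchemy against PostgreSQL, MySQL, or BigQuery as the posting says. The column names and validation rules below are illustrative assumptions.

```python
import csv, io, sqlite3

# "Spreadsheet" source, here an in-memory CSV with a duplicate row.
spreadsheet = io.StringIO("name,revenue\nAcme,1200\nGlobex,950\nAcme,1200\n")
rows = list(csv.DictReader(spreadsheet))

# Basic data-quality checks before load: required fields present,
# numeric revenue, duplicates dropped.
seen, clean = set(), []
for r in rows:
    key = (r["name"], r["revenue"])
    if r["name"] and r["revenue"].isdigit() and key not in seen:
        seen.add(key)
        clean.append((r["name"], int(r["revenue"])))

# Load into the target relational store.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE companies (name TEXT NOT NULL, revenue INTEGER)")
con.executemany("INSERT INTO companies VALUES (?, ?)", clean)

count = con.execute("SELECT COUNT(*) FROM companies").fetchone()[0]
print(count)  # 2 -- the duplicate Acme row was dropped
```

An automated workflow would wrap these steps in a scheduled script with logging and error handling, which is where the Git-based version control and CI/CD practices listed above come in.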
Posted 1 month ago
0 years
4 - 9 Lacs
Noida
On-site
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant – Java & GCP Developer. In this role, you will be responsible for designing, developing, troubleshooting, and tuning Java APIs and microservices on Google Cloud Platform (GCP). Responsibilities Experience with Spring Boot. Must have GCP experience. Experience with microservices development. Extensive experience working with Java APIs with Oracle is critical. Extensive experience in Java 11 SE. Experience with unit testing frameworks such as JUnit or Mockito. Experience with Maven/Gradle. Professional, precise communication skills. Experience in API design, troubleshooting, and tuning for performance. Qualifications we seek in you! Minimum Qualifications BE/B.Tech/M.Tech/MCA. Preferred Qualifications Experience with Oracle 11g or 12c PL/SQL is preferred. Experience in health care or pharmacy related industries is preferred.
Familiarity with Toad and/or SQL Developer tools. Experience working with Angular and the Spring Boot framework as well. Experience with Kubernetes and Azure cloud. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job: Lead Consultant. Primary Location: India-Noida. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jun 25, 2025, 12:03:20 PM. Unposting Date: Ongoing. Master Skills List: Consulting. Job Category: Full Time
Posted 1 month ago
0 years
0 Lacs
Greater Chennai Area
On-site
Job Title : Snowflake Data Engineer Location : Chennai Job Type : Full Time Job Summary: We are looking for a skilled and detail-oriented Snowflake Data Engineer to join our data engineering team. The ideal candidate should have hands-on experience with Snowflake, DBT, SQL, and any one of the cloud platforms (AWS, Azure, or GCP). Experience or exposure to Python for data transformation or scripting is a plus. Required Skills: Strong experience with Snowflake data warehousing architecture and features. Hands-on expertise in DBT (Data Build Tool) for transformation and modelling. Proficiency in SQL – complex joins, window functions, performance tuning. Experience in at least one major cloud platform: AWS, Azure, or GCP. Knowledge of data modelling (dimensional/star schema, normalization, etc.) Familiarity with CI/CD pipelines for data deployments.
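The "complex joins, window functions" skill this posting asks for can be illustrated with a short query. The sketch below uses SQLite (3.25+ supports window functions, bundled with current Python) in place of Snowflake; the sales schema is a hypothetical example, but the RANK() OVER (PARTITION BY ...) pattern is the same in Snowflake SQL.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [("north", 100), ("north", 300), ("south", 200), ("south", 50)])

# Rank each sale within its region by amount, descending: a window
# function computes per-row values over a partition without collapsing
# rows the way GROUP BY would.
ranked = con.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
print(ranked)
# [('north', 300, 1), ('north', 100, 2), ('south', 200, 1), ('south', 50, 2)]
```

In a DBT project, a query like this would typically live in a model file, with the source table supplied by a ref() or source() macro.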
Posted 1 month ago
0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Job Description The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive action and a willingness to take up responsibility beyond the assigned work area is a plus. Apprentice Analyst Roles and Responsibilities: Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data through research across different sources such as the internet, specific websites, databases, etc. Data quality checks and correction. Data profiling and reporting (basic). Email communication with the client on request acknowledgment, project status, and responses to queries. Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective. Provide technical consulting to the customer category managers on industry best practices for product data enhancement. Technical and Functional Skills: Bachelor's degree in Engineering from the Electrical, Mechanical, or Electronics stream. Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications. Intermediate knowledge of MS Office/Internet.
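The standardization and normalization work described here often means bringing scraped product specifications into a consistent unit and format. A minimal sketch, assuming a hypothetical power attribute and a small hand-built unit table:

```python
# Conversion factors to a canonical unit (watts); the table is an
# illustrative assumption, not a standard library.
UNIT_TO_WATTS = {"w": 1.0, "kw": 1000.0, "hp": 745.7}

def normalize_power(raw: str) -> float:
    """Convert values like '2 kW', '1.5HP', '750 W' into watts."""
    text = raw.strip().lower().replace(" ", "")
    # Check longer unit suffixes first so "kw" is not misread as "w".
    for unit, factor in sorted(UNIT_TO_WATTS.items(), key=lambda kv: -len(kv[0])):
        if text.endswith(unit):
            return float(text[: -len(unit)]) * factor
    raise ValueError(f"unrecognized unit in {raw!r}")

print(normalize_power("2 kW"))   # 2000.0
print(normalize_power("1.5HP"))  # about 1118.55
```

Real enrichment work layers many such rules (units, casing, category taxonomies) plus manual research for gaps, but the core pattern is mapping messy source values onto one canonical representation.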
Posted 1 month ago
6.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
JOB_POSTING-3-71879-1 Job Description Role Title: AVP, Enterprise Logging & Observability (L11) Company Overview Synchrony (NYSE: SYF) is a premier consumer financial services company delivering one of the industry's most complete digitally enabled product suites. Our experience, expertise and scale encompass a broad spectrum of industries including digital, health and wellness, retail, telecommunications, home, auto, outdoors, pet and more. We have recently been ranked #2 among India's Best Companies to Work for by Great Place to Work. We were among the Top 50 India's Best Workplaces in Building a Culture of Innovation by All by GPTW and Top 25 among Best Workplaces in BFSI by GPTW. We have also been recognized by AmbitionBox Employee Choice Awards among the Top 20 Mid-Sized Companies, ranked #3 among Top Rated Companies for Women, and Top-Rated Financial Services Companies. Synchrony celebrates ~51% women diversity, 105+ people with disabilities, and ~50 veterans and veteran family members. We offer Flexibility and Choice for all employees and provide best-in-class employee benefits and programs that cater to work-life integration and overall well-being. We provide career advancement and upskilling opportunities, focusing on Advancing Diverse Talent to take up leadership roles. Organizational Overview Splunk is Synchrony's enterprise logging solution. Splunk searches and indexes log files and helps derive insights from the data. The primary goal is to ingest massive datasets from disparate sources and employ advanced analytics to automate operations and improve data analysis. It also offers predictive analytics and unified monitoring for applications, services, and infrastructure. Many applications forward data to the Splunk logging solution. The Splunk team, spanning Engineering, Development, Operations, Onboarding, and Monitoring, maintains Splunk and provides solutions to teams across Synchrony.
Role Summary/Purpose The AVP, Enterprise Logging & Observability is a key leadership role responsible for driving the strategic vision, roadmap, and development of the organization's centralized logging and observability platform. The role supports multiple enterprise initiatives including applications, security monitoring, compliance reporting, operational insights, and platform health tracking. It leads platform development using Agile methodology, manages stakeholder priorities, ensures logging standards across applications and infrastructure, and supports security initiatives. This position bridges the gap between technology teams, applications, platforms, cloud, cybersecurity, infrastructure, DevOps, governance, audit, and risk teams and business partners, owning and evolving the logging ecosystem to support real-time insights, compliance monitoring, and operational excellence. Key Responsibilities Splunk Development & Platform Management: Lead and coordinate development activities, ingestion pipeline enhancements, onboarding frameworks, and alerting solutions. Collaborate with engineering, operations, and Splunk admins to ensure scalability, performance, and reliability of the platform. Establish governance controls for source naming, indexing strategies, retention, access controls, and audit readiness. Splunk ITSI Implementation & Management: Develop and configure ITSI services, entities, and correlation searches. Implement notable-event aggregation policies and automate response actions. Fine-tune ITSI performance by optimizing data models, summary indexing, and saved searches. Help identify patterns and anomalies in logs and metrics. Develop ML models for anomaly detection, capacity planning, and predictive analytics. Utilize Splunk MLTK to build and train models for IT operations monitoring. Security & Compliance Enablement: Partner with InfoSec, Risk, and Compliance to align logging practices with regulations (e.g., PCI-DSS, GDPR, RBI).
Enable visibility for encryption events, access anomalies, secrets management, and audit trails. Support security control mapping and automation through observability. Stakeholder Engagement: Act as a strategic advisor and point of contact for business units and for application, infrastructure, and security stakeholders and business teams leveraging Splunk. Conduct stakeholder workshops, backlog grooming, and sprint reviews to ensure alignment. Maintain clear and timely communications across all levels of the organization. Process & Governance: Drive logging and observability governance standards, including naming conventions, access controls, and data retention policies. Lead initiatives for process improvement in log ingestion, normalization, and compliance readiness. Ensure alignment with enterprise architecture and data classification models. Lead improvements in logging onboarding lifecycle time, automation pipelines, and self-service ingestion tools. Mentor junior team members and guide engineering teams on secure, standardized logging practices. Required Skills/Knowledge Bachelor's degree with a minimum of 6 years of experience in Technology, or in lieu of a degree, 8 years of experience in Technology. Minimum of 3 years of experience leading a development team or in an equivalent role in observability, logging, or security platforms. Splunk Subject Matter Expert (SME). Strong hands-on understanding of Splunk architecture, pipelines, dashboards, alerting, data ingestion, search optimization, and enterprise-scale operations. Experience supporting security use cases, encryption visibility, secrets management, and compliance logging. Splunk Development & Platform Management, Security & Compliance Enablement, Stakeholder Engagement, and Process & Governance. Experience with Splunk premium apps, minimally ITSI and Enterprise Security (ES). Experience with data streaming platforms and tools such as Cribl and Splunk Edge Processor.
Proven ability to work in Agile environments using tools such as JIRA or JIRA Align. Strong communication, leadership, and stakeholder management skills. Familiarity with security, risk, and compliance standards relevant to BFSI. Proven experience leading product development teams and managing cross-functional initiatives using Agile methods. Strong knowledge and hands-on experience with Splunk Enterprise/Splunk Cloud. Design and implement Splunk ITSI solutions for proactive monitoring and service health tracking. Develop KPIs, Services, Glass Tables, Entities, Deep Dives, and Notable Events to improve service reliability for users across the firm Develop scripts (python, JavaScript, etc.) as needed in support of data collection or integration Develop new applications leveraging Splunk’s analytic and Machine Learning tools to maximize performance, availability and security improving business insight and operations. Support senior engineers in analyzing system issues and performing root cause analysis (RCA). Desired Skills/Knowledge Deep knowledge of Splunk development, data ingestion, search optimization, alerting, dashboarding, and enterprise-scale operations. Exposure to SIEM integration, security orchestration, or SOAR platforms. Knowledge of cloud-native observability (e.g. AWS/GCP/Azure logging). Experience in BFSI or regulated industries with high-volume data handling. Familiarity with CI/CD pipelines, DevSecOps integration, and cloud-native logging. Working knowledge of scripting or automation (e.g., Python, Terraform, Ansible) for observability tooling. Splunk certifications (Power User, Admin, Architect, or equivalent) will be an advantage . Awareness of data classification, retention, and masking/anonymization strategies. 
Awareness of integration between Splunk and ITSM or incident management tools (e.g., ServiceNow, PagerDuty). Experience with version control tools such as Git and Bitbucket. Eligibility Criteria Bachelor's degree with a minimum of 6 years of experience in Technology, or in lieu of a degree, 8 years of experience in Technology. Minimum of 3 years of experience leading a development team or in an equivalent role in observability, logging, or security platforms. Demonstrated success in managing large-scale logging platforms in regulated environments. Excellent communication, leadership, and cross-functional collaboration skills. Experience with scripting languages such as Python, Bash, or PowerShell for automation and integration purposes. Prior experience in large-scale, security-driven logging or observability platform development. Excellent problem-solving skills and the ability to work independently or as part of a team. Strong communication and interpersonal skills to interact effectively with team members and stakeholders. Knowledge of IT Service Management (ITSM) and monitoring tools. Knowledge of other data analytics tools or platforms is a plus. WORK TIMINGS: 01:00 PM to 10:00 PM IST. This role qualifies for Enhanced Flexibility and Choice offered in Synchrony India and will require the incumbent to be available between 06:00 AM Eastern Time and 11:30 AM Eastern Time (timings are anchored to US Eastern hours and will adjust twice a year locally). This window is for meetings with India and US teams. The remaining hours will be flexible for the employee to choose. Exceptions may apply periodically due to business needs. Please discuss this with the hiring manager for more details.
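The anomaly-detection responsibility in this role (for which Splunk MLTK provides built-in algorithms) rests on simple statistics. A toy z-score detector over a metric series, with an illustrative threshold and data, not a Splunk API:

```python
import statistics

def anomalies(series, threshold=2.5):
    """Return indices of points more than `threshold` standard
    deviations from the mean (population stdev)."""
    mean = statistics.mean(series)
    stdev = statistics.pstdev(series)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, x in enumerate(series) if abs(x - mean) / stdev > threshold]

latencies_ms = [12, 11, 13, 12, 11, 12, 13, 11, 12, 95]  # one obvious spike
print(anomalies(latencies_ms))  # [9]
```

In a Splunk deployment the equivalent logic would run as a saved search over indexed metrics, with MLTK handling model fitting and the alerting framework turning detections into notable events.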
For Internal Applicants Understand the criteria and mandatory skills required for the role before applying. Inform your manager and HRM before applying for any role on Workday. Ensure that your professional profile is updated (fields such as education, prior experience, other skills); it is mandatory to upload your updated resume (Word or PDF format). You must not be on any corrective action plan (First Formal/Final Formal, PIP). L09+ employees who have completed 18 months in the organization and 12 months in their current role and level are eligible to apply. Level/Grade: 11. Job Family Group: Information Technology
Posted 1 month ago
5.0 years
0 Lacs
Greater Chennai Area
On-site
Job Overview Plan A Technologies is looking for an MS SQL Server DB Developer. This is a fast-paced job with room for significant career growth. Please note: you must have at least 5 years of experience as an MS SQL Server Developer or Database Developer to be considered for this role. JOB RESPONSIBILITY Develop, maintain, and optimize database solutions using SQL Server. Write efficient T-SQL queries, stored procedures, triggers, and functions. Perform database schema design, normalization, and optimization. Collaborate with developers, analysts, and stakeholders to understand database requirements. Optimize database performance through query optimization and indexing. Troubleshoot and resolve database issues such as performance bottlenecks and data corruption. Participate in code reviews, testing, and deployment activities. Stay updated on emerging database technologies and trends. Experience 5-7 years of experience as an MS SQL Server Developer or Database Developer. Proficiency in T-SQL and experience with SQL Server versions (2012/2014/2016/2019). Strong understanding of database concepts including normalization, indexing, and transactions. Experience with database administration tasks such as backup, recovery, and security. Familiarity with ETL tools for data integration (e.g., SSIS, Azure Data Factory). Knowledge of SSRS is an advantage. Excellent problem-solving skills and attention to detail. Excellent communication skills: must have at least upper-intermediate English (both verbal and written). Advanced problem-solving abilities and research and learning skills. Ability to work with engineers in multiple countries. Must have an organized and analytical working style and the ability to plan your own work. Initiative and drive to do great things. About The Company/Benefits Plan A Technologies is an American software development and technology advisory firm that brings top-tier engineering talent to clients around the world.
Our software engineers tackle custom product development projects, staff augmentation, major integrations and upgrades, and much more. The team is far more hands-on than the giant outsourcing shops, but still big enough to handle major enterprise clients. Read more about us here: www.PlanAtechnologies.com . Location: Chennai, India – Hybrid work schedule: you will be required to work from our Chennai office for a minimum of 2 weeks per month. Great colleagues and an upbeat work environment: You'll join an excellent team of supportive engineers and project managers who work hard but don't ever compete with each other. Benefits: Vacation, Brand New Laptop, and More: You’ll get a generous vacation schedule, and other goodies. If this sounds like you, we'd love to hear from you!
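The backup-and-recovery duty listed in this posting can be sketched in miniature. SQL Server uses BACKUP DATABASE / RESTORE DATABASE statements; the stand-in below uses SQLite's online backup API from Python's stdlib, with a hypothetical schema, to show the same take-a-copy / restore-after-loss cycle.

```python
import sqlite3

# Source database with some data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
src.execute("INSERT INTO accounts (balance) VALUES (100)")
src.commit()

# Take a consistent online backup (in a real system: a file or
# offsite target, taken on a schedule).
backup = sqlite3.connect(":memory:")
src.backup(backup)

# Simulate data loss, then restore from the backup copy.
src.execute("DELETE FROM accounts")
src.commit()
backup.backup(src)

restored = src.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
print(restored)  # 100
```

In production the design questions are the ones the posting implies: full versus differential versus log backups, retention, and regularly testing that restores actually work.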
Posted 1 month ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Project Role: Program/Project Management Representative. Project Role Description: Deliver business and technology outcomes for assigned program, project, or contracted service. Leverage standard tools, methodologies, and processes to deliver, monitor, and control service level agreements. Must-have skills: Laboratory Information and Execution Systems. Good-to-have skills: Life Sciences. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As a LabVantage application developer, design, develop, and maintain software applications using a Laboratory Information Management System (LIMS). Collaborate with cross-functional teams to ensure seamless integration with other IT components. Conduct rigorous system testing and troubleshooting to optimize the performance of software applications. Provide expert technical guidance and support to project teams throughout the implementation lifecycle. Ensure compliance with software development standards and best practices. Roles & Responsibilities: - As a LabVantage Application Developer, your day-to-day activities will revolve around leveraging your advanced proficiency in Laboratory Information Management System (LIMS) to develop and maintain software applications. - You'll be responsible for designing, coding, testing, and debugging software applications. - You'll be entrusted with ensuring seamless integration with other IT components, thus playing a significant role in contributing to the organization's overall success. - You must have advanced proficiency in Laboratory Information Management System (LIMS). - Intermediate proficiency in Configuration & Release Management and advanced proficiency in Design & Build Enablement will be advantageous. - Expected to perform independently and become an SME. - Active participation and contribution in team discussions is required. - Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to define project objectives and scope. - Develop and maintain project plans, including timelines, budgets, and resource allocation. - Monitor project progress and ensure adherence to timelines and deliverables. - Identify and mitigate project risks and issues. Professional & Technical Skills: - Must-Have Skills: Proficiency in Laboratory Information and Execution Systems. - Strong understanding of statistical analysis and machine learning algorithms. - Experience with data visualization tools such as Tableau or Power BI. - Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms. - Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity. Additional Information: - The candidate should have a minimum of 3 years of experience in Laboratory Information and Execution Systems. - This position is based at our Bengaluru office. - 15 years of full-time education is required.
Posted 1 month ago
4.0 - 8.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Details Category: Data Science Location : Bangalore Experience Level: 4-8 Years Position Description We are looking for a Data Engineer who will play a pivotal role in transforming raw data into actionable intelligence through sophisticated data pipelines and machine learning deployment frameworks. They will collaborate across functions to understand business objectives, engineer data solutions, and ensure robust AI/ML model deployment and monitoring. This role is ideal for someone passionate about data science, MLOps, and building scalable data ecosystems in cloud environments. Key Responsibilities Data Engineering & Data Science: Preprocess structured and unstructured data to prepare for AI/ML model development. Apply strong skills in feature engineering, data augmentation, and normalization techniques. Manage and manipulate data using SQL, NoSQL, and cloud-based data storage solutions such as Azure Data Lake. Design and implement efficient ETL pipelines, data wrangling, and data transformation strategies. Model Deployment & MLOps Deploy ML models into production using Azure Machine Learning (Azure ML) and Kubernetes. Implement MLOps best practices, including CI/CD pipelines, model versioning, and monitoring frameworks. Design mechanisms for model performance monitoring, alerting, and retraining. Utilize containerization technologies (Docker/Kubernetes) to support deployment and scalability Business & Analytics Insights Work closely with stakeholders to understand business KPIs and decision-making frameworks. Analyze large datasets to identify trends, patterns, and actionable insights that inform business strategies. Develop data visualizations using tools like Power BI, Tableau, and Matplotlib to communicate insights effectively. Conduct A/B testing and evaluate model performance using metrics such as precision, recall, F1-score, MSE, RMSE, and model validation techniques. 
Desired Profile
- Proven experience in data engineering, AI/ML data preprocessing, and model deployment.
- Strong expertise in working with both structured and unstructured datasets.
- Hands-on experience with SQL, NoSQL databases, and cloud data platforms (e.g., Azure Data Lake).
- Deep understanding of MLOps practices, containerization (Docker/Kubernetes), and production-level model deployment.
Technical Skills
- Proficient in ETL pipeline creation, data wrangling, and transformation methods.
- Strong experience with Azure ML, Kubernetes, and other cloud-based deployment technologies.
- Excellent knowledge of data visualization tools (Power BI, Tableau, Matplotlib).
- Expertise in model evaluation and testing techniques, including A/B testing and performance metrics.
Soft Skills
- Strong analytical mindset with the ability to solve complex data-related problems.
- Ability to collaborate with cross-functional teams to understand business needs and provide actionable insights.
- Clear communication skills to convey technical details to non-technical stakeholders.
If you are passionate about working in a collaborative and challenging environment, apply now!
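The evaluation metrics named in this posting (precision, recall, F1-score) are simple to compute from classification counts. A minimal illustrative sketch in Python, not tied to any particular toolchain mentioned in the role:

```python
def classification_metrics(y_true, y_pred):
    """Compute precision, recall, and F1 for a binary classifier."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Toy example: four predictions against ground truth
p, r, f1 = classification_metrics([1, 0, 1, 1], [1, 0, 0, 1])
```

In the toy example there are two true positives, no false positives, and one false negative, so precision is 1.0 and recall is 2/3.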
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title – Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Description
You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of the software lifecycle, including coding, troubleshooting, deployment, and ongoing maintenance.

Role Responsibilities
- Design, develop, and implement SQL Server databases based on business requirements and best practices.
- Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access.
- Ensure data integrity, security, and performance through proper database design and normalization techniques.
- Analyze query execution plans and performance metrics to identify and address performance bottlenecks.
- Implement indexing strategies and database optimizations to improve query performance.
- Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases.
- Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage.

Minimum Requirements
- BTech degree in Computer Science or a related discipline; MTech degree preferred.
- Assertive communication and strong analytical, problem-solving, debugging, and leadership skills.
- Experience with source control tools like Bitbucket and/or Git.
- Strong hands-on experience diagnosing performance bottlenecks and wait stats, plus SQL query monitoring, review, and optimization strategies.
- Ability to create normalized and highly scalable logical and physical database designs, and to switch between database technologies such as Oracle, SQL Server, and Elastic databases.
- 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization.
- 5+ years of SQL Server development experience with strong programming experience writing stored procedures and functions.
- Excellent understanding of Snowflake and other data warehouses.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Understanding of AWS storage services and AWS Cloud Infrastructure offerings.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.

Benefits
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees.
We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class. Job Applicant's Privacy Notice Click on this link to read the Job Applicant's Privacy Notice
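The indexing-strategy responsibility in the posting above can be illustrated with a small, engine-agnostic sketch. This uses Python's built-in sqlite3 rather than SQL Server, purely for illustration (table and column names are invented): once an index exists on the filtered column, the query planner switches from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = ?"

# Before indexing: the planner must scan the whole table
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

# Add an index on the filtered column, as an indexing strategy would suggest
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchone()[3]

# plan_before mentions a table scan; plan_after mentions the new index
```

The same analysis on SQL Server would use the graphical or SHOWPLAN execution plan rather than EXPLAIN QUERY PLAN, but the before/after comparison is the core of the workflow.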
Posted 1 month ago
0 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Job Description
The ideal candidate must possess strong communication skills, with an ability to listen to and comprehend information and share it with all the key stakeholders, highlighting opportunities for improvement and concerns, if any. He/she must be able to work collaboratively with teams to execute tasks within defined timeframes while maintaining high-quality standards and superior service levels. The ability to take proactive actions and a willingness to take up responsibility beyond the assigned work area is a plus.

Apprentice Analyst
Roles and responsibilities:
- Data enrichment/gap fill, standardization, normalization, and categorization of online and offline product data via research through different sources such as the internet, specific websites, databases, etc.
- Data quality checks and correction
- Data profiling and reporting (basic)
- Email communication with the client on request acknowledgment, project status, and responses to queries
- Help customers enhance their product data quality (electrical, mechanical, electronics) from the technical specification and description perspective
- Provide technical consulting to the customer category managers around industry best practices for product data enhancement

Technical and Functional Skills
- Bachelor's degree in Engineering (Electrical, Mechanical, or Electronics stream)
- Excellent technical knowledge of engineering products (pumps, motors, HVAC, plumbing, etc.) and technical specifications
- Intermediate knowledge of MS Office/Internet.
Posted 1 month ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
Role: Team Leader - Service Desk
Location: Pune/Bangalore

Job Summary
Candidates with a minimum of 6 years of Service Desk experience, including a minimum of 2 years in a front-line leadership/management role. We are looking for candidates with domain expertise in End User Support Services, skilled in technical troubleshooting and delivery operations management. Passport (mandatory); advantage - US business visa (B1).

Years of experience needed – 5-8 years

Technical Skills
Analytical skills, effective business communication, coaching skills, operations management, SLA management, MS Office, operational knowledge of contact center platforms and ITSM tools, performance management skills, conflict management skills, capacity management, presentation skills, training need identification.

Technical Skills - Client
- Technical Service Awareness – Intermediate
- Technical Troubleshooting - Account Management/password reset – Advanced
- Technical Troubleshooting - OS – Advanced
- Technical Troubleshooting - End Devices – Advanced
- Ticketing Tool – Advanced
- MS Office – Intermediate
- Contact center platform operating skills – Intermediate
- Contact center platform reports – Intermediate
- Networking concepts – Intermediate
- Client Process Knowledge – Advanced
- DMAIC Methodology – Intermediate
- Client Business Awareness – Advanced
- Telephone etiquette – Expert
- Email etiquette – Expert
- Customer service skills – Expert
- Knowledge Base Navigation Skills – Advanced
- Analytical skills – Intermediate
- Operations Management – Advanced
- SLA Management – Intermediate
- Effective Business Communication – Advanced
- Decision-Making Skills – Advanced
- Measuring Performance/Performance Management Skills – Advanced
- Coaching for Success – Advanced
- Motivating Others – Advanced
- Conflict Management Skills – Advanced
- Patience – Advanced
- Managing Stress – Advanced
- Positive attitude to change – Advanced
- Attitude to feedback/willingness to learn – Advanced
- Relating to Others – Advanced
- Influencing Others – Advanced
- Team Player – Advanced
- Insight into the Customer's Mindset – Advanced
- Solution-Based Approach – Advanced
- Follow-Through – Advanced
- Personal Credibility – Advanced
- Self-Development – Intermediate
- Result Focus – Intermediate
- Drive to Win – Intermediate
- Recognizing Efforts – Advanced
- Approachability – Advanced
- Dealing with Fairness – Expert
- Fostering Teamwork – Advanced

Management Skills
- Supervise and review Service Desk activities.
- Review and ensure compliance with standards like PCI, ISO, ISMS, and BCMS by facilitating audits by internal and external teams.
- Place hiring requests and conduct interviews.
- Work with HR and support groups to improve employee retention and satisfaction.
- Give in-person feedback to reporting agents on a daily basis regarding ticket hygiene and operational/procedural hygiene.
- Perform root cause analysis, tracking, and reporting of escalations and SLA misses.
- Attend change meetings and analyze potential impact to Service Desk operations.
- Conduct performance appraisal and normalization.
- Participate in calibration and collaboration meetings with support function leads.
- Conduct new-hire technical and account-specific training based on requirements.
- Create, maintain, and update the account training plan.
- Provide hands-on assistance to team members in case of issues, both through direct intervention and mentoring.
- Prepare scorecards, and discuss and share feedback around improvement areas.
- Identify top performers and nominate them for rewards, recognition, and appreciation.
- Monitor ticket ageing reports and drive team members to work on ageing tickets.
- FCR analysis: find controllable resolution errors that could have been resolved at L1.

Behavioral Skills
Good communication, positive energy, positive attitude, self-learner.

Qualification
Any Graduate

Certification
ITIL certified

About Mphasis
Mphasis applies next-generation technology to help enterprises transform businesses globally.
Customer centricity is foundational to Mphasis and is reflected in the Mphasis’ Front2Back™ Transformation approach. Front2Back™ uses the exponential power of cloud and cognitive to provide hyper-personalized (C=X2C2TM=1) digital experience to clients and their end customers. Mphasis’ Service Transformation approach helps ‘shrink the core’ through the application of digital technologies across legacy environments within an enterprise, enabling businesses to stay ahead in a changing world. Mphasis’ core reference architectures and tools, speed and innovation with domain expertise and specialization are key to building strong relationships with marquee clients.
Posted 1 month ago
7.0 years
0 Lacs
India
Remote
Role Overview
We are looking for a highly skilled and experienced ServiceNow professional (7+ years) to join our freelance technical interview panel. As a panelist, you'll play a critical role in assessing candidates for ServiceNow Developer, Admin, and Architect roles by conducting deep technical interviews and evaluating hands-on expertise, problem-solving skills, and platform knowledge. This is an excellent opportunity for technically strong freelancers who enjoy sharing their expertise, influencing hiring decisions, and working flexible hours remotely.

Key Responsibilities
- Conduct live technical interviews and evaluations over video calls (aligned to EST hours)
- Assess candidates' practical expertise in:
  - Core ServiceNow modules (ITSM, CMDB, Discovery, Incident/Change/Problem)
  - Custom application development & configuration
  - Client/server-side scripting (JavaScript, Business Rules, UI Policies, Script Includes)
  - Integrations (REST/SOAP APIs, Integration Hub)
  - Flow Designer, Service Portal, ACLs, ATF, and CI/CD practices
- Review coding tasks and scenario-based architecture questions
- Provide detailed, structured feedback and recommendations to the hiring team
- Collaborate on refining technical evaluation criteria as needed

Required Skills & Experience (Advanced Technical Expertise)
- 10+ years of extensive hands-on experience with the ServiceNow platform in enterprise-grade environments
- Strong command of ServiceNow core modules: ITSM, ITOM, CMDB, Asset & Discovery, Incident/Change/Problem/Knowledge Management
- Proven expertise in custom application development using scoped apps, App Engine Studio, and the Now Experience UI Framework
- Deep proficiency in ServiceNow scripting, including:
  - Server-side: Business Rules, Script Includes, Scheduled Jobs, GlideRecord, GlideAggregate
  - Client-side: UI Policies, Client Scripts, UI Actions, GlideForm/GlideUser APIs
  - Middleware logic for cross-platform communication and custom handlers
- Experience implementing Access Control Lists
(ACLs) with dynamic filters and condition-based restrictions
- Expert in Service Portal customization using AngularJS widgets, Bootstrap, and custom REST endpoints
- Proficient in Integration Hub, custom REST/SOAP APIs, OAuth 2.0 authentication, MID Server integrations, and external system integration (e.g., SAP, Azure, Jira, Dynatrace)
- Hands-on with Flow Designer, Orchestration, and Event Management
- Expertise in ServiceNow CMDB: CI class modeling, reconciliation rules, identification/normalization strategies, and dependency mappings
- Familiarity with ServiceNow performance tuning: scheduled-job optimization, lazy loading, database indexing, and client/server execution efficiency
- Working knowledge of the Automated Test Framework (ATF) and integration with CI/CD pipelines (Jenkins, Git, Azure DevOps)
- Understanding of ServiceNow DevOps, version control, scoped app publishing, and update set migration best practices
- Knowledge of Security Operations (SecOps) and Governance, Risk & Compliance (GRC) is a plus
- Experience guiding architectural decisions, governance models, and platform upgrade strategies
- Prior experience conducting technical interviews and design evaluations, or acting as a technical SME/panelist
- Excellent communication and feedback documentation skills, with the ability to clearly explain technical rationale and candidate assessments
- Comfortable working independently and engaging with global stakeholders during US EST hours (after 8 PM IST)
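CMDB identification and normalization, mentioned among the skills above, boils down to canonicalizing free-form attribute values and then matching incoming CI records against existing ones on identifying attributes. A language-neutral sketch in Python; the attribute names, alias table, and matching rule are invented for illustration and are not ServiceNow APIs:

```python
# Hypothetical identification/normalization sketch; not ServiceNow code.
NORMALIZE = {"microsoft windows server 2019": "Windows Server 2019",
             "win server 2019": "Windows Server 2019"}

def normalize_ci(record):
    """Canonicalize the key fields so equivalent records compare equal."""
    rec = dict(record)
    rec["fqdn"] = rec["fqdn"].strip().lower()
    rec["os"] = NORMALIZE.get(rec["os"].strip().lower(), rec["os"])
    return rec

def identify(incoming, cmdb, key="fqdn"):
    """Return the existing CI matching the identification key, else None (insert)."""
    for ci in cmdb:
        if ci[key] == incoming[key]:
            return ci
    return None

cmdb = [{"fqdn": "app01.example.com", "os": "Windows Server 2019"}]
new = normalize_ci({"fqdn": " APP01.example.com ", "os": "win server 2019"})
match = identify(new, cmdb)  # matches the existing CI, so update rather than insert
```

In ServiceNow itself this is configured through CI identification rules and normalization data rather than hand-written code; the sketch only shows the underlying idea an interviewer would probe for.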
Posted 1 month ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description

Job Summary
Amazon.com’s Buyer Risk Prevention (BRP) mission is to make Amazon the safest and most trusted place worldwide to transact online. Amazon runs one of the most dynamic e-commerce marketplaces in the world, with nearly 2 million sellers worldwide selling hundreds of millions of items in ten countries. BRP safeguards every financial transaction across all Amazon sites. As such, BRP designs and builds the software systems, risk models, and operational processes that minimize risk and maximize trust in Amazon.com. The BRP organization is looking for a data scientist for its Risk Mining Analytics (RMA) team, whose mission is to combine advanced analytics with investigator insight to detect negative customer experiences, improve system effectiveness, and prevent bad debt across Amazon. As a data scientist in risk mining, you will be responsible for modeling complex problems, discovering insights, and building risk algorithms that identify opportunities through statistical models, machine learning, and visualization techniques to improve operational efficiency and reduce bad debt. You will need to collaborate effectively with business and product leaders within BRP and cross-functional teams to build scalable solutions against high organizational standards. The candidate should be able to apply a breadth of tools, data sources, and data science techniques to answer a wide range of high-impact business questions and proactively present new insights in a concise and effective manner. The candidate should be an effective communicator capable of independently driving issues to resolution and communicating insights to non-technical audiences. This is a high-impact role with goals that directly impact the bottom line of the business.
Key Job Responsibilities
- Analyze terabytes of data to define and deliver on complex analytical deep dives, unlocking insights and building scalable solutions through data science to ensure the security of Amazon’s platform and transactions
- Build machine learning and/or statistical models that evaluate transaction legitimacy and track impact over time
- Ensure data quality throughout all stages of acquisition and processing, including data sourcing/collection, ground truth generation, normalization, transformation, and cross-lingual alignment/mapping
- Define and conduct experiments to validate/reject hypotheses, and communicate insights and recommendations to product and tech teams
- Develop efficient data querying infrastructure for both offline and online use cases
- Collaborate with cross-functional teams from multidisciplinary science, engineering, and business backgrounds to enhance current automation processes
- Learn and understand a broad range of Amazon’s data resources and know when, how, and which to use and which not to use
- Maintain technical documentation and communicate results to diverse audiences with effective writing, visualizations, and presentations

Basic Qualifications
- 3+ years of experience with data querying languages (e.g., SQL), scripting languages (e.g., Python), or statistical/mathematical software (e.g., R, SAS, Matlab)
- 2+ years of data scientist experience
- 3+ years of experience with machine learning/statistical modeling data analysis tools and techniques, and the parameters that affect their performance
- Experience applying theoretical models in an applied environment

Preferred Qualifications
- Experience in Python, Perl, or another scripting language

Our inclusive culture empowers Amazonians to deliver the best results for our customers.
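A transaction-legitimacy model of the kind described above can, at its simplest, be a logistic score over engineered risk features. A toy sketch; the feature names and weights are invented for illustration, whereas a production model would learn them from labeled data:

```python
import math

# Invented feature weights for illustration; a real model would learn these.
WEIGHTS = {"amount_zscore": 1.2, "new_account": 0.8, "mismatched_geo": 1.5}
BIAS = -2.0

def risk_score(features):
    """Logistic risk score in [0, 1]; higher means more likely illegitimate."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# A routine transaction scores low; an anomalous one scores high.
low = risk_score({"amount_zscore": 0.1, "new_account": 0, "mismatched_geo": 0})
high = risk_score({"amount_zscore": 2.5, "new_account": 1, "mismatched_geo": 1})
```

Tracking impact over time, as the posting asks, would then mean monitoring how the score distribution and downstream bad-debt outcomes shift after each model revision.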
If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - Amazon Dev Center India - Hyderabad Job ID: A2998274
Posted 1 month ago
5.0 years
5 - 9 Lacs
Hyderābād
On-site
Hyderabad, Telangana
Job ID: 30162733
Job Category: Digital Technology
Job Title – Data Engineer (SQL Server, Python, AWS, ETL)
Preferred Location: Hyderabad, India
Full Time/Part Time: Full Time

Build a career with confidence. Carrier Global Corporation, a global leader in intelligent climate and energy solutions, is committed to creating solutions that matter for people and our planet for generations to come. From the beginning, we've led in inventing new technologies and entirely new industries. Today, we continue to lead because we have a world-class, diverse workforce that puts the customer at the center of everything we do.

Role Description:
You will work with high-performance software engineering and analytics teams that consistently deliver on commitments with continuous quality and efficiency improvements. In this role, you will develop technical capabilities for several of Carrier's software development teams, supporting both current and next-generation technology initiatives. This position requires a demonstrated, hands-on technical person with the ability to deliver technical tasks and own the development phase of the software lifecycle, including coding, troubleshooting, deployment, and ongoing maintenance.

Role Responsibilities:
- Design, develop, and implement SQL Server databases based on business requirements and best practices.
- Create database schemas, tables, views, stored procedures, and functions to support application functionality and data access.
- Ensure data integrity, security, and performance through proper database design and normalization techniques.
- Analyze query execution plans and performance metrics to identify and address performance bottlenecks.
- Implement indexing strategies and database optimizations to improve query performance.
- Design and implement ETL processes to extract, transform, and load data from various sources into SQL Server databases.
- Document database configurations, performance tuning activities, and Power BI solutions for knowledge sharing and future reference.
- Provide training and support to end-users on SQL Server best practices, database performance optimization techniques, and Power BI usage.

Minimum Requirements:
- BTech degree in Computer Science or a related discipline; MTech degree preferred.
- Assertive communication and strong analytical, problem-solving, debugging, and leadership skills.
- Experience with source control tools like Bitbucket and/or Git.
- Strong hands-on experience diagnosing performance bottlenecks and wait stats, plus SQL query monitoring, review, and optimization strategies.
- Ability to create normalized and highly scalable logical and physical database designs, and to switch between database technologies such as Oracle, SQL Server, and Elastic databases.
- 5+ years of overall experience building and maintaining SQL Server and data engineering for the organization.
- 5+ years of SQL Server development experience with strong programming experience writing stored procedures and functions.
- Excellent understanding of Snowflake and other data warehouses.
- Experience in designing and hands-on development of cloud-based analytics solutions.
- Understanding of AWS storage services and AWS Cloud Infrastructure offerings.
- Experience designing and building data pipelines using API ingestion and streaming ingestion methods.
- Knowledge of DevOps processes (including CI/CD) and Infrastructure as Code is essential.

Benefits
We are committed to offering competitive benefits programs for all of our employees, and to enhancing our programs when necessary.
- Have peace of mind and body with our health insurance
- Make yourself a priority with flexible schedules and leave policy
- Drive your career forward through professional development opportunities
- Achieve your personal goals with our Employee Assistance Program

Our commitment to you
Our greatest assets are the expertise, creativity and passion of our employees.
We strive to provide a great place to work that attracts, develops and retains the best talent, promotes employee engagement, fosters teamwork and ultimately drives innovation for the benefit of our customers. We strive to create an environment where you feel that you belong, with diversity and inclusion as the engine to growth and innovation. We develop and deploy best-in-class programs and practices, providing enriching career opportunities, listening to employee feedback and always challenging ourselves to do better. This is The Carrier Way. Join us and make a difference. Apply Now! Carrier is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.
Posted 1 month ago
2.0 years
0 Lacs
India
On-site
Job Title: DBMS Trainer
Location: Hyderabad, Telangana
Experience Required: Minimum 2 years in database development or training
Employment Type: Full-Time, Onsite

Job Summary:
We are seeking a dynamic and experienced DBMS Trainer to join our team in Hyderabad. The ideal candidate will have a strong background in database systems, both relational and NoSQL, and a passion for mentoring and training aspiring database professionals. You will be responsible for delivering engaging, interactive, and industry-relevant training sessions on core database concepts, administration, optimization, and real-world applications.

Key Responsibilities:
- Curriculum Development: Design, develop, and maintain comprehensive training modules covering SQL (MySQL, PostgreSQL, Oracle), NoSQL (MongoDB, Cassandra), database design, normalization, indexing, backup/recovery strategies, and data modeling.
- Training Delivery: Conduct engaging, in-person classroom and lab sessions on SQL querying, stored procedures, transactions, optimization, security best practices, and cloud DBMS concepts.
- Hands-On Workshops: Facilitate practical, real-world exercises including schema design, performance tuning, backup/recovery, and managing unstructured data scenarios.
- Mentorship & Assessment: Evaluate learners through quizzes, assignments, and capstone projects. Provide continuous feedback, interview preparation, and career counseling support.
- Content Updating: Regularly update course content to reflect industry advancements, including cloud databases, big data integrations, and emerging DBMS technologies.
- Lab & Tool Management: Set up, manage, and troubleshoot training environments (both on-premises and cloud-based), and work closely with technical teams to ensure seamless training delivery.

Required Qualifications:
- Bachelor's degree in Computer Science, IT, ECE, or a related field.
- Minimum 2 years of hands-on experience in database development, administration, or technical training roles.
Technical Skills:
- SQL Databases: MySQL, PostgreSQL, Oracle (queries, joins, transactions, stored procedures)
- NoSQL Databases: MongoDB, Cassandra (document modeling, indexing)
- Database Design & Administration: ER modeling, normalization, indexing, backup & recovery, security management
- Performance Tuning: query optimization, indexing strategies, monitoring and logging tools
- Data Modeling: relational and unstructured/NoSQL data structures
- Basic Cloud DBMS: familiarity with AWS RDS, Azure SQL, Firebase/Firestore
- Version Control & Scripting: Git, basic shell/SQL scripts for automation
- Communication & Mentoring: strong presentation, troubleshooting, and feedback skills

Preferred Extras:
- Certifications such as Oracle OCA, AWS/Azure database certifications, MongoDB Certified Developer
- Experience with big data tools (Hive, Spark SQL) or cloud-native data platforms
- Experience using Learning Management Systems (LMS) and e-learning platforms
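Normalization, the recurring topic in this curriculum, is easy to demonstrate in a lab: move a repeating attribute (here, a department name) into its own table and recover the original rows with a join. A small sketch using Python's built-in sqlite3; the table and column names are invented for the exercise:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Denormalized design would repeat the department name on every employee row.
# Normalized design: departments get their own table, referenced by id.
conn.executescript("""
    CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE employees (
        id INTEGER PRIMARY KEY,
        name TEXT,
        dept_id INTEGER REFERENCES departments(id)
    );
    INSERT INTO departments (id, name) VALUES (1, 'Engineering'), (2, 'Sales');
    INSERT INTO employees (name, dept_id) VALUES
        ('Asha', 1), ('Ravi', 1), ('Meena', 2);
""")

# A join recovers the denormalized view on demand.
rows = conn.execute("""
    SELECT e.name, d.name FROM employees e
    JOIN departments d ON d.id = e.dept_id ORDER BY e.id
""").fetchall()
```

The same exercise carries over directly to MySQL, PostgreSQL, or Oracle; only the connection boilerplate changes.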
Posted 1 month ago
2.0 - 3.0 years
15 Lacs
India
Remote
We are seeking a skilled and detail-oriented PostgreSQL Database Developer & Designer to join our team. The ideal candidate will be responsible for designing, developing, optimizing, and maintaining scalable and secure PostgreSQL databases that support our application and business needs.

Key Responsibilities:
- Design and develop efficient and scalable database schemas, tables, views, indexes, and stored procedures
- Develop and optimize complex SQL queries, functions, and triggers in PostgreSQL
- Perform data modeling and create ER diagrams to support business logic and performance
- Work closely with application developers to design and implement data access patterns
- Monitor database performance and tune queries for high availability and efficiency
- Maintain data integrity, quality, and security across all environments
- Develop and manage ETL processes, migrations, and backup strategies
- Assist in database version control and deployment automation
- Troubleshoot and resolve database-related issues in development and production

Required Skills & Qualifications:
- Minimum 2–3 years of experience in PostgreSQL database development and design
- Strong understanding of relational database design principles, normalization, and indexing
- Proficiency in writing complex SQL queries, functions, stored procedures, and performance tuning
- Experience with data modeling tools (e.g., pgModeler, dbdiagram.io, ER/Studio)
- Familiarity with database version control (e.g., Liquibase, Flyway)
- Solid understanding of PostgreSQL internals, the query planner, and performance optimization techniques
- Knowledge of data security, encryption, and compliance standards
- Strong problem-solving skills and attention to detail

Nice to Have (Pluses):
- Experience with cloud databases (e.g., Amazon RDS for PostgreSQL, Google Cloud SQL, Azure Database for PostgreSQL)
- Familiarity with NoSQL or hybrid data architectures
- Exposure to Kafka, RabbitMQ, or other message brokers
- Experience working in Agile/Scrum teams
- Knowledge
of CI/CD pipelines for database deployments
- Understanding of data warehousing and analytics/reporting workflows

What We Offer:
- Competitive compensation package
- Opportunity to work on high-impact systems and large-scale databases
- Collaborative team environment with growth and learning opportunities
- Remote-friendly and flexible work schedule

Job Type: Full-time
Pay: ₹1,500,000.00 per year
Benefits: Health insurance
Schedule: Day shift
Experience: PostgreSQL: 5 years (Required); SQL: 5 years (Required)
Work Location: In person
Application Deadline: 05/07/2025
Expected Start Date: 01/08/2025
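Triggers, listed among the responsibilities above, are a common way to enforce integrity and auditing rules in the database itself. In PostgreSQL they are written as PL/pgSQL trigger functions, but the idea can be sketched portably with Python's built-in sqlite3 (schema invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL CHECK (balance >= 0));
    CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);

    -- Trigger: every balance change is recorded automatically.
    CREATE TRIGGER log_balance_change
    AFTER UPDATE OF balance ON accounts
    BEGIN
        INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
    END;
""")

conn.execute("INSERT INTO accounts (id, balance) VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 75.0 WHERE id = 1")
log = conn.execute("SELECT * FROM audit_log").fetchall()
```

In PostgreSQL the equivalent would be `CREATE TRIGGER ... EXECUTE FUNCTION ...` pointing at a PL/pgSQL function that reads the OLD and NEW row variables; the CHECK constraint shown is the same in both engines.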
Posted 1 month ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description:
We are seeking a skilled Data Engineer with strong experience in Python, Snowflake, and AWS. The ideal candidate will be responsible for building and optimizing scalable data pipelines, integrating diverse data sources, and supporting analytics and business intelligence solutions in a cloud environment. A key focus will include designing and managing AWS Glue jobs and enabling efficient, serverless ETL workflows.

Key Responsibilities:
- Design and implement robust data pipelines using AWS Glue, Lambda, and Python.
- Work extensively with Snowflake for data warehousing, modelling, and analytics support.
- Manage ETL/ELT jobs using AWS Glue and ensure end-to-end data reliability.
- Migrate data between CRM systems, especially from Snowflake to Salesforce, following defined business rules and ensuring data accuracy.
- Optimize SQL/SOQL queries, handle large volumes of data, and maintain high levels of performance.
- Implement data normalization and data quality checks to ensure accurate, consistent, and deduplicated records.

Required Skills:
- Strong programming skills in Python.
- Hands-on experience with the Snowflake data warehouse.
- Proficiency in AWS services: Glue, S3, Lambda, Redshift, CloudWatch.
- Experience with ETL/ELT pipelines and data integration using AWS Glue jobs.
- Proficient in SQL and SOQL for data extraction and transformation.
- Understanding of data modelling, normalization, and performance optimization.

Nice to Have:
- Familiarity with Salesforce Data Loader, ETL mapping, and metadata-driven migration.
- Experience with CI/CD tools, DevOps, and version control (e.g., Git).
- Experience working in Agile/Scrum environments.
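The normalization-and-deduplication responsibility above typically reduces to canonicalizing matching fields and dropping records that collapse to the same key, regardless of whether the pipeline runs in Glue or elsewhere. A minimal pure-Python sketch; the field names and rules are illustrative only:

```python
def canonical_key(record):
    """Normalize the matching fields so near-duplicates compare equal."""
    email = record["email"].strip().lower()
    name = " ".join(record["name"].split()).title()  # collapse whitespace, fix case
    return (name, email)

def deduplicate(records):
    """Keep the first record seen for each canonical key."""
    seen, unique = set(), []
    for rec in records:
        key = canonical_key(rec)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

rows = [
    {"name": "asha  rao", "email": "Asha@Example.com "},
    {"name": "Asha Rao", "email": "asha@example.com"},   # duplicate of the first
    {"name": "Ravi Iyer", "email": "ravi@example.com"},
]
clean = deduplicate(rows)  # two unique records remain
```

In a real migration the survivorship rule (which duplicate to keep) would come from the defined business rules the posting mentions, not from arrival order.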
Posted 1 month ago
2.0 years
3 - 5 Lacs
Noida
On-site
Position: Web Developer
We are looking for a highly skilled Web Developer with 2+ years of experience in web-based project development. The successful candidate will be responsible for designing, developing, and implementing web applications using PHP and various open-source frameworks.

Key Responsibilities:
- Collaborate with cross-functional teams to identify and prioritize project requirements
- Develop and maintain high-quality, efficient, and well-documented code
- Troubleshoot and resolve technical issues
- Implement social network integration, payment gateway integration, and Web 2.0 features in web-based projects
- Work with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Develop and maintain database PL/SQL, stored procedures, and triggers

Requirements:
- 2+ years of experience in web-based project development using PHP
- Experience with open-source frameworks such as Laravel, WordPress, Drupal, Joomla, OsCommerce, OpenCart, TomatoCart, VirtueMart, Magento, Yii 2, CakePHP 2.6, Zend 1.10, and Kohana
- Strong knowledge of object-oriented PHP, cURL, Ajax, Prototype.js, jQuery, web services, design patterns, MVC architecture, and object-oriented methodologies
- Experience with RDBMS design, normalization, data modelling, transactions, and distributed databases
- Well-versed with MySQL (can work with other SQL flavors too)

Job Type: Full-time
Pay: ₹25,000.00 - ₹45,000.00 per month
Work Location: In person
Posted 1 month ago
4.0 - 6.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description

Position and Department Details
Role: Asst. Manager (Onroll)
Department: Operations Hub, Data & Capability (GIX-Intelligence)

Job Role:

Lean Activities:
- Operational Excellence: Identify and implement improvement opportunities to enhance quality, reduce latency, optimize costs, and mitigate risks, driving overall efficiency and effectiveness of output.
- Process Governance and Compliance: Oversee and ensure that all processes are accurately documented, up-to-date, and aligned with standard operating procedures (SOPs), guaranteeing consistency and adherence to established protocols.
- Process Optimization and Analysis: Conduct thorough analyses of existing processes, leveraging tools such as Value Stream Mapping (VSM) and Failure Mode Effect Analysis (FMEA) to identify areas for improvement and inform data-driven decision-making.
- Capability Development and Training: Design and deliver training programs for employees on Lean methodologies and tools, including Root Cause Analysis (RCA), FMEA, and other relevant techniques, to enhance skills and knowledge and foster a culture of continuous improvement.

DQM Activities:
- Data Quality Monitoring: Develop and implement data quality monitoring processes to identify and track data quality issues, including data validation.
- Data Quality Reporting: Create and maintain data quality reports to track and analyze data quality metrics, including data accuracy, completeness, and consistency.
- Data Quality Issue Resolution: Collaborate with stakeholders to identify and resolve data quality issues, including root cause analysis and implementation of corrective actions.
- Data Quality Process Development: Develop and maintain data quality processes and procedures, including data validation rules, data cleansing procedures, and data normalization standards.
- Stakeholder Management: Communicate data quality issues and resolutions to stakeholders, including business users, data analysts, and IT teams.
- Process Improvement: Continuously monitor and improve data quality processes and procedures to ensure they are efficient, effective, and aligned with business needs.
- Compliance: Ensure data quality processes and procedures comply with regulatory requirements, including data privacy and data security regulations.
- Training and Development: Provide training and development opportunities to data quality team members to ensure they have the necessary skills and knowledge to perform their jobs effectively.
- Special Projects: Participate in special projects, including data quality assessments, data quality audits, and data quality improvement initiatives.

Basic Qualification:
- Graduate/Master's degree (preferably business/commerce background) with at least 4 to 6 years of experience in Lean practice.
- Excellent working knowledge of advanced MS Excel, MS Word, MS PowerPoint, and MS Outlook.
- Good communication skills and experience in handling senior stakeholders.

Certification:
- Lean Six Sigma Green Belt certification is a must.
- Preferable: Lean Six Sigma Black Belt certified.

Expectations:
- The individual should be a quick learner, diligent, and efficient in the timely completion of assigned tasks.
- The individual should be able to think independently, logically, and critically assess requirements and ensure troubleshooting and solutions.
- The individual should be able to multi-task and handle multiple activities at a time.
- The individual should have attention to detail and be solution oriented.
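The data quality metrics mentioned above (accuracy, completeness, consistency) can be sketched in a few lines of Python. This is a hypothetical illustration of a completeness check, not the team's actual tooling; the record fields are made up:

```python
def completeness(records, required_fields):
    """Fraction of required fields that are present and non-empty
    across all records -- one simple data quality metric."""
    total = len(records) * len(required_fields)
    filled = sum(
        1
        for rec in records
        for field in required_fields
        if rec.get(field) not in (None, "")
    )
    return filled / total if total else 1.0

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},   # fails the completeness rule on `email`
]
print(completeness(records, ["id", "email"]))  # 3 of 4 fields filled -> 0.75
```

In a real DQM process, metrics like this would be computed on a schedule and fed into the data quality reports the role describes.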
Posted 1 month ago
8.0 - 12.0 years
12 - 22 Lacs
Hyderabad
Work from Office
We are seeking a highly experienced and self-driven Senior Data Engineer to design, build, and optimize modern data pipelines and infrastructure. This role requires deep expertise in Snowflake, DBT, Python, and cloud data ecosystems. You will play a critical role in enabling data-driven decision-making across the organization by ensuring the availability, quality, and integrity of data.

Key Responsibilities:
- Design and implement robust, scalable, and efficient data pipelines using ETL/ELT frameworks.
- Develop and manage data models and data warehouse architecture within Snowflake.
- Create and maintain DBT models for transformation, lineage tracking, and documentation.
- Write modular, reusable, and optimized Python scripts for data ingestion, transformation, and automation.
- Collaborate closely with data analysts, data scientists, and business teams to gather and fulfill data requirements.
- Ensure data integrity, consistency, and governance across all stages of the data lifecycle.
- Monitor pipeline performance and implement optimization strategies for queries and storage.
- Follow best practices for data engineering, including version control (Git), testing, and CI/CD integration.

Required Skills and Qualifications:
- 8+ years of experience in Data Engineering or related roles.
- Deep expertise in Snowflake: schema design, performance tuning, security, and access controls.
- Proficiency in Python, particularly for scripting, data transformation, and workflow automation.
- Strong understanding of data modeling techniques (e.g., star/snowflake schema, normalization).
- Proven experience with DBT for building modular, tested, and documented data pipelines.
- Familiarity with ETL/ELT tools and orchestration platforms like Apache Airflow or Prefect.
- Advanced SQL skills with experience handling large and complex data sets.
- Exposure to cloud platforms such as AWS, Azure, or GCP and their data services.

Preferred Qualifications:
- Experience implementing data quality checks and governance frameworks.
- Understanding of the modern data stack and CI/CD pipelines for data workflows.
- Contributions to data engineering best practices, open-source projects, or thought leadership.
Posted 1 month ago
8.0 - 10.0 years
0 Lacs
Gurugram, Haryana, India
On-site
As Associate Manager, Data Engineering, you will:
- Lead the team of Data Engineers and develop innovative approaches to performance optimization and automation
- Analyze enterprise specifics to understand the current-state data schema and data model, and contribute to defining the future-state data schema, data normalization, and schema integration as required by the project
- Apply coding expertise, best practices, and guidance in Python, SQL, Informatica, and cloud data platform development to members of the team
- Collaborate with clients to harden, scale, and parameterize code to be scalable across brands and regions
- Understand business objectives and develop business intelligence applications that help monitor and improve critical business metrics
- Monitor project timelines, ensuring deliverables are being met by team members
- Communicate frequently with stakeholders on project requirements, statuses, and risks
- Manage the monitoring of productionized processes to ensure pipelines execute successfully every day, communicating delays as required to stakeholders
- Contribute to the design of scalable data integration frameworks to move and transform a variety of large data sets
- Develop robust work products by following best practices through all stages of development, testing, and deployment

Skills and Qualifications
- B.Tech / master's degree in a quantitative field (statistics, business analytics, computer science)
- Team management experience is a must: 8-10 years of experience, with at least 2-4 years of experience managing a team
- Vast background in all things data related
- Intermediate proficiency with Python and data-related libraries (PySpark, Pandas, etc.)
- High proficiency with SQL; Snowflake is REQUIRED - we need someone with a high level of Snowflake experience, and certification is a big plus
- AWS data platform development experience
- High proficiency with data warehousing and data modeling
- Experience with ETL tools (Informatica, Talend, DataStage) required; Informatica is our tool and is required - IICS or PowerCenter is accepted
- Ability to coach team members, setting them up for success in their roles
- Capable of connecting with team members and inspiring them to be their best

The Yum! Brands story is simple. We have four distinctive, relevant, and easy global brands - KFC, Pizza Hut, Taco Bell, and The Habit Burger Grill - born from the hopes and dreams, ambitions and grit of passionate entrepreneurs. And we want more of this to create our future! As the world's largest restaurant company, we have a clear and compelling mission: to build the world's most loved, trusted, and fastest-growing restaurant brands. The key and not-so-secret ingredient in our recipe for growth is our unrivaled talent and culture, which fuels our results. We're looking for talented, motivated, visionary, and team-oriented leaders to join us as we elevate and personalize the customer experience across our 48,000 restaurants, operating in 145 countries and territories around the world! We put pizza, chicken, and tacos in the hands of customers through customized ordering, unique delivery approaches, app experiences, click-and-collect services, and consumer data analytics, creating unique customer dining experiences - and we are only getting started. Employees may work for a single brand and potentially grow to support all company-owned brands depending on their role. Regardless of where they work, as a company opening an average of 8 restaurants a day worldwide, the growth opportunities are endless.
Taco Bell has been named one of the 10 Most Innovative Companies in the World by Fast Company; Pizza Hut delivers more pizzas than any other pizza company in the world; and KFC still uses its 75-year-old finger lickin' good recipe, including secret herbs and spices, to hand-bread its chicken every day. Yum! and its brands have offices in Chicago, IL; Louisville, KY; Irvine, CA; Plano, TX; and other markets around the world. We don't just say we are a great place to work - our commitments to the world and our employees show it. Yum! has been named to the Dow Jones Sustainability North America Index and ranked among the top 100 Best Corporate Citizens by Corporate Responsibility Magazine, in addition to being named to the Bloomberg Gender-Equality Index. Our employees work in an environment where the value of "believe in all people" is lived every day, enjoying benefits including but not limited to: 4 weeks' vacation PLUS holidays, sick leave, and 2 paid days to volunteer for the cause of their choice, plus a dollar-for-dollar matching gift program; generous parental leave; and competitive benefits including medical, dental, vision, and life insurance, as well as a 6% 401(k) match - all encompassed in Yum!'s world-famous recognition culture.
Posted 1 month ago
7.5 years
0 Lacs
Coimbatore, Tamil Nadu, India
On-site
Project Role: Full Stack Engineer
Project Role Description: Responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. Use development skills to deliver innovative solutions that help our clients improve the services they provide. Leverage new technologies that can be applied to solve challenging business problems with a cloud-first and agile mindset.
Must have skills: Java Full Stack Development, Node.js
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: BE

Summary: As a Full Stack Engineer, you will be responsible for developing and/or engineering the end-to-end features of a system, from user experience to backend code. You will use your development skills to deliver innovative solutions that help our clients improve the services they provide. Additionally, you will leverage new technologies to solve challenging business problems with a cloud-first and agile mindset.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and engineer end-to-end features of a system.
- Deliver innovative solutions to improve client services.
- Utilize development skills to solve challenging business problems.
- Stay updated with new technologies and apply them to projects.

Professional & Technical Skills:
- Must-have skills: proficiency in Java Full Stack Development, Apache Kafka.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization, to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Java Full Stack Development.
- This position is based at our Bengaluru office.
- A BE degree is required.
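One of the data munging techniques listed above, normalization before feeding data to ML algorithms, can be shown in a minimal Python sketch (the sample values are made up; this is min-max scaling, one of several common normalization schemes):

```python
def min_max_normalize(values):
    """Rescale a list of numbers to the [0, 1] range -- a common
    normalization step when preparing features for ML algorithms
    like the regression and clustering methods listed above."""
    lo, hi = min(values), max(values)
    if lo == hi:                       # constant column: avoid division by zero
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([10, 15, 20]))  # [0.0, 0.5, 1.0]
```

Without a step like this, features measured on large scales can dominate distance-based algorithms such as clustering.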
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: SAP
Management Level: Manager

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
- Experience in designing and implementing the ELT architecture to build a data warehouse, including source-to-staging and staging-to-target mapping design
- Experience in configuring Master Repository, Work Repository, Projects, Models, Sources, Targets, Packages, Knowledge Modules, Mappings, Scenarios, Load Plans, and Metadata
- Experience in creating database connections and physical and logical schemas using the Topology Manager
- Experience in the creation of packages, construction of data warehouses and data marts, and synchronization using ODI
- Experience in architecting data-related solutions, developing data warehouses, developing ELT/ETL jobs, performance tuning, and identifying bottlenecks in the process flow
- Experience using dimensional data modeling, star schema modeling, and snowflake modeling
- Experience using normalization, fact and dimension tables, and physical and logical data modeling
- Good knowledge of Oracle Cloud services and database options
- Strong Oracle SQL expertise using tools such as SQL Developer
- Understanding of ERP modules is good to have

Mandatory Skill Sets: ODI, OAC
Preferred Skill Sets: ODI, OAC
Years of experience required: 7 - 12
Education Qualification: B.Tech / M.Tech / MBA / MCA
Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering, Master of Business Administration
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Data Integrator (ODI)
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment {+ 21 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 month ago
0.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
Senior Scrum Master
Hyderabad, India | Information Technology | 316176

Job Description

About The Role: Grade Level (for internal use): 10

The Role: Senior Scrum Master

The Team: The team is focused on agile product development, offering insights into global capital markets and the financial services industry. This is an opportunity to be a pivotal part of our fast-growing global organization during an exciting phase in our company's evolution.

The Impact: The Senior Scrum Master plays a crucial role in driving Agile transformation within the technology team. By facilitating efficient processes and fostering a culture of continuous improvement, this role directly contributes to the successful delivery of projects and enhances overall team performance.

What's in it for you:
- Opportunity to lead and drive Agile transformation within a leading global organization.
- Engage with a dynamic team committed to delivering high-quality solutions.
- Access to professional development and growth opportunities within S&P Global.
- Work in a collaborative and innovative environment that values continuous improvement.

Responsibilities and Impact:
- Facilitate Agile ceremonies such as sprint planning, daily stand-ups, retrospectives, and reviews.
- Act as a servant leader to the Agile team, guiding them towards continuous improvement and effective delivery.
- Manage scope changes and risks, escalate issues as needed, coordinate testing efforts, and assist scrum teams with technical transitions.
- Support the team in defining and achieving sprint goals and objectives.
- Foster a culture of collaboration and transparency within the team and across stakeholders.
- Encourage and support the development of team members, mentoring them in Agile best practices.
- Conduct data analysis and create and interpret metrics for team performance tracking and improvement.
- Conduct business analysis and requirement gathering sessions to align database solutions with stakeholder needs.
- Collaborate with stakeholders to help translate business requirements into technical specifications.
- Ensure adherence to Agile best practices and participate in Scrum events.
- Lead initiatives to improve team efficiency and effectiveness in project delivery.

What We're Looking For:

Basic Required Qualifications:
- Bachelor's degree in a relevant field or equivalent work experience.
- Minimum of 5-9 years of experience in a Scrum Master role, preferably within a technology team.
- Strong understanding of Agile methodologies, particularly Scrum and Kanban.
- Excellent communication and interpersonal skills.
- Proficiency in business analysis: experience in gathering and analyzing business requirements, translating them into technical specifications, and collaborating with stakeholders to ensure alignment between business needs and database solutions.
- Requirement gathering expertise: ability to conduct stakeholder interviews, workshops, and requirements gathering sessions to elicit, prioritize, and document business requirements related to database functionality and performance.
- Basic understanding of SQL queries: ability to comprehend and analyze existing SQL queries to identify areas for performance improvement.
- Fundamental understanding of database structure: awareness of database concepts including normalization, indexing, and schema design to assess query performance.

Additional Preferred Qualifications:
- Certified Scrum Master (CSM) or similar Agile certification.
- Experience with Agile tools such as Azure DevOps, JIRA, or Trello.
- Proven ability to lead and influence teams in a dynamic environment.
- Familiarity with the software development lifecycle (SDLC) and cloud platforms like AWS, Azure, or Google Cloud.
- Experience in project management and stakeholder engagement.
- Experience leveraging AI tools to support requirements elicitation, user story creation and refinement, agile event facilitation, and continuous improvement through data-driven insights.
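The indexing and query-plan awareness the qualifications above describe can be demonstrated with a small self-contained SQLite example in Python. The table and index names are illustrative, and the exact wording of `EXPLAIN QUERY PLAN` output varies between SQLite versions:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT)")
con.executemany(
    "INSERT INTO orders (customer) VALUES (?)",
    [(f"cust{i % 100}",) for i in range(1000)],
)

query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'cust7'"

# Without an index on the filtered column, SQLite must scan the whole table.
plan_before = con.execute(query).fetchone()[-1]
print(plan_before)   # something like "SCAN orders"

# With an index, the planner can seek directly to the matching rows.
con.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan_after = con.execute(query).fetchone()[-1]
print(plan_after)    # mentions "USING INDEX idx_orders_customer"
```

Reading plans like these is exactly the kind of lightweight SQL analysis the role calls for: spotting a full scan on a frequently filtered column and proposing an index.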
About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology - the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide - so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets.
Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you - and your career - need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards - small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316176
Posted On: 2025-06-25
Location: Hyderabad, Telangana, India
Posted 1 month ago