
751 Schema Jobs - Page 10

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 8.0 years

1 - 4 Lacs

Chennai

Work from Office


Job Title: Snowflake Developer. Experience: 6-8 years. Location: Chennai (hybrid). Requirements: 3+ years of experience as a Snowflake Developer or Data Engineer; strong knowledge of SQL, SnowSQL, and Snowflake schema design; experience with ETL tools and data pipeline automation; a basic understanding of US healthcare data (claims, eligibility, providers, payers); experience working with large-scale datasets and cloud platforms (AWS, Azure, GCP); and familiarity with data governance, security, and compliance (HIPAA, HITECH).
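To illustrate the schema-design and SnowSQL skills this role names, here is a minimal sketch in Python using the snowflake-connector-python library. The claims star schema, table names, and connection parameters are all hypothetical placeholders, not the employer's actual model.

```python
# Hypothetical star-schema DDL for a healthcare claims mart.
import snowflake.connector

DDL = """
CREATE TABLE IF NOT EXISTS dim_provider (
    provider_id   NUMBER PRIMARY KEY,
    provider_name VARCHAR(200),
    specialty     VARCHAR(100)
);
CREATE TABLE IF NOT EXISTS fact_claim (
    claim_id      NUMBER PRIMARY KEY,
    provider_id   NUMBER REFERENCES dim_provider (provider_id),
    claim_date    DATE,
    billed_amount NUMBER(12, 2)
);
"""

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="...",         # placeholder
    warehouse="ETL_WH",
    database="HEALTHCARE",
    schema="CLAIMS",
)
try:
    # The connector executes one statement per call, so split the script.
    for stmt in DDL.split(";"):
        if stmt.strip():
            conn.cursor().execute(stmt)
finally:
    conn.close()
```

A fact table keyed to conformed dimensions like this is the usual starting point for the claims/eligibility/provider reporting the posting describes.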

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office


Responsibilities:
- Drive full-cycle development of complex product features across the C++ and Python stack.
- Design and write test automation in Perl and shell scripting for validation.
- Contribute to performance tuning, debugging, and production issue resolution across multi-threaded applications.
- Contribute to the containerization strategy using Docker, Kubernetes, and OCP.
- Mentor junior developers and promote best coding/testing practices.
- Work closely with product management and customer success to align technical deliverables with business goals.
- Lead discussions on virtualization enhancements and product roadmap improvements.

Required education: Bachelor's Degree. Preferred education: Bachelor's Degree.

Required technical and professional expertise:
- 5+ years of extensive experience in C++, Perl, and Python, specializing in enterprise-grade backend systems.
- Deep understanding of Makefile systems, shell scripting, GDB, and performance debugging.
- Proven experience with multi-threading, socket programming (see the sketch below), and LDAP directory services.
- Hands-on experience with virtualization techniques and working knowledge of containerized environments (Docker, Kubernetes, OCP).
- Advanced understanding of Postgres DB, schema design, and optimization.
- Comfortable working across Linux, Windows, and AIX platforms with automated testing in Perl.
- Demonstrated experience in performance tuning and high-availability systems.

Preferred technical and professional experience:
- Prior experience contributing to or maintaining LDAP directory servers or authentication/identity products.
- In-depth understanding of virtualization, including VM orchestration and resource management.
- Ability to lead performance optimization initiatives for high-scale systems.
- Exposure to scalable, fault-tolerant systems and secure programming practices.
- Contributions to open-source or internal tools for debugging or performance monitoring.
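The multi-threading and socket-programming requirements above lend themselves to a small illustration. This is a hedged sketch in Python rather than the role's C++/Perl stack, purely to show the pattern of one worker thread per accepted connection; host, port, and protocol are assumptions.

```python
# Minimal threaded TCP echo server: accept loop on the main thread,
# one daemon worker thread per client connection.
import socket
import threading

def handle(conn: socket.socket) -> None:
    with conn:
        while data := conn.recv(4096):  # echo until the client disconnects
            conn.sendall(data)

def serve(host: str = "127.0.0.1", port: int = 9000) -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _addr = srv.accept()
            # A production server would bound this with a thread pool.
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```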

Posted 3 weeks ago

Apply

2.0 - 6.0 years

12 - 16 Lacs

Bengaluru

Work from Office


As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: AWS Data Vault 2.0 development for agile data ingestion, storage, and scaling; Databricks for complex queries covering transformation, aggregation, and business logic implementation; AWS Redshift and Redshift Spectrum for complex queries covering transformation, aggregation, and business logic implementation; DWH concepts including star schemas and materialized views; strong SQL and data manipulation/transformation skills. Preferred technical and professional experience: robust and scalable cloud infrastructure; end-to-end data engineering pipelines; versatile programming capabilities.
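As a hedged illustration of the Spark-with-Python and star-schema concepts above, here is a minimal PySpark sketch; the fact/dimension tables, paths, and column names are invented placeholders.

```python
# Join a hypothetical fact table to a dimension and aggregate,
# the basic star-schema query pattern.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

fact = spark.read.parquet("s3://bucket/fact_sales/")  # placeholder path
dim = spark.read.parquet("s3://bucket/dim_store/")    # placeholder path

revenue_by_region = (
    fact.join(dim, "store_id")
        .groupBy("region")
        .agg(F.sum("amount").alias("total_revenue"))
)
revenue_by_region.show()
```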

Posted 3 weeks ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Chennai, Bengaluru

Hybrid


IMMEDIATE JOINERS AND CANDIDATES ALREADY SERVING NOTICE PERIOD PREFERRED FOR ALL ROLES.

1) Title: BFF JavaScript API Developer. Position Type: Full Time/Contract. Location: Any location; Chennai/Bangalore preferable. Experience: 6+ years. Description: We're looking for a JavaScript-based API Developer/Engineer to help build and maintain APIs for a Backend for Frontend (BFF) application. The ideal candidate should have: strong proficiency in JavaScript and Node.js; solid experience designing and implementing RESTful and/or GraphQL APIs; a good grasp of GraphQL schemas, queries, and mutations (see the sketch below); experience integrating with various backend systems; the ability to optimize performance and troubleshoot issues effectively; and strong collaboration skills to work closely with both frontend and backend developers for smooth data integration.

2) Title: Java & Node.js Developer. Position Type: Full Time. Location: Any client location with a hybrid work model. Experience: 7+ years. Description: Java + Node.js skill set at an India location. Requirement: strong Java skills are necessary, along with Node.js.

3) Title: Senior Backend Developer. Position Type: Full Time. Location: Any location; Chennai/Bangalore preferable. Experience: 6+ years. Description: Seeking a senior-level developer with strong experience in backend development, especially in building and supporting highly available, resilient, and performant systems. Should be a self-starter who can quickly get familiar with our codebase and begin contributing within a few days of onboarding, with the following technical skills: strong proficiency in Java (8 or above), with a solid understanding of functional and asynchronous APIs; hands-on experience with Scala 2 and a good understanding of reactive programming concepts; familiarity with Akka and Akka Streams; experience with messaging frameworks like Kafka, including advanced integration use cases; a solid understanding of AWS services including DynamoDB, Neptune, Lambda, API Gateway, EC2, and ECS; and comfort with DevOps and observability using CloudWatch logs, metrics, and traces.
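For role 1, the GraphQL query/variables mechanics are easy to show in a few lines. This sketch uses Python and requests rather than the role's Node.js stack, purely as an illustration; the endpoint and schema are invented.

```python
# Post a GraphQL query with variables to a hypothetical BFF endpoint.
import requests

ENDPOINT = "https://bff.example.com/graphql"  # placeholder

QUERY = """
query GetOrder($id: ID!) {
  order(id: $id) { id status total }
}
"""

resp = requests.post(ENDPOINT, json={"query": QUERY, "variables": {"id": "42"}})
resp.raise_for_status()
print(resp.json()["data"]["order"])
```

A mutation works the same way: the BFF exposes one schema tailored to the frontend and fans out to backend systems behind it.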

Posted 3 weeks ago

Apply

7.0 - 9.0 years

10 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Knowledge of Solr search and schema design for various data elements, with prior experience in customization/implementation. Design, develop, and improve open-source search APIs and SDKs. Good understanding of Solr: stemming, NLP, result grouping, nested fields, and ranking algorithms. Good knowledge of B2B commerce applications and commerce objectives. Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.
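As a small, hedged sketch of the Solr querying and result-grouping skills mentioned above, here is a Python example against Solr's standard HTTP select handler; the core name, host, and fields are assumptions.

```python
# Query a hypothetical "products" core and group results by brand.
import requests

params = {
    "q": "title:headphones",   # full-text query against a text field
    "group": "true",           # result grouping, as the posting mentions
    "group.field": "brand",
    "fl": "id,title,price",
    "wt": "json",
}
resp = requests.get("http://localhost:8983/solr/products/select", params=params)
resp.raise_for_status()
for group in resp.json()["grouped"]["brand"]["groups"]:
    print(group["groupValue"], len(group["doclist"]["docs"]))
```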

Posted 3 weeks ago

Apply

3.0 - 6.0 years

9 - 14 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will: We are seeking a highly skilled and experienced Senior Data Engineer with 10+ years' experience in Python, Apache Flink, Apache Beam, MongoDB, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate should also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting and automation. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.

Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam. Implement stateful stream processing, event-time handling, and windowing with Flink. Optimize Flink jobs for performance, scalability, and fault tolerance. Build scalable, high-performance applications using Java. Write clean, maintainable, and efficient code following best practices. Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases. Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution. Monitor and troubleshoot Airflow DAGs to ensure smooth operations. Leverage GCP services to build and deploy cloud-native solutions: Dataflow for real-time and batch data processing pipelines; BigQuery for data analysis and query optimization on large datasets; Pub/Sub for messaging and event-driven architectures; GCS for managing and optimizing cloud storage for data pipelines; Composer for orchestrating workflows using Apache Airflow on GCP. Deploy and manage containerized applications on Google Kubernetes Engine (GKE). Design Kubernetes manifests and Helm charts for deploying scalable and fault-tolerant applications. Design and manage NoSQL databases using MongoDB, including schema design, indexing, and query optimization. Ensure data consistency and performance for high-throughput applications. Use Python for scripting, automation, and building utility tools. Write Python scripts to interact with APIs, process data, and manage workflows. Architect distributed systems with a focus on scalability, reliability, and performance. Design fault-tolerant systems with high availability using best practices. Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions. Participate in code reviews, design discussions, and technical decision-making. Monitor production systems using tools like Stackdriver, Prometheus, or Grafana. Optimize resource usage and costs for GCP services and Kubernetes clusters.
Requirements. To be successful in this role, you should meet the following requirements: Strong proficiency in Java with experience in building scalable and high-performance applications. Basic to intermediate knowledge of Python for scripting and automation. Hands-on experience with Apache Flink for real-time stream processing and batch processing. Knowledge of Flink's state management, windowing, and event-time processing. Experience with Flink's integration with GCP services. Knowledge of Apache Beam for unified batch and stream data processing (see the sketch below). Proficiency in Apache Airflow for building and managing workflows. Experience with Composer on GCP is a plus. Strong experience with Google Cloud Platform (GCP) services: Dataflow, BigQuery, Pub/Sub, GCS, and Composer. Familiarity with GCP IAM, networking, and cost optimization. Hands-on experience with Docker for containerization. Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE). Expertise in MongoDB, including schema design, indexing, and query optimization. Familiarity with other NoSQL or relational databases is a plus. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Ability to work in an agile environment and adapt to changing requirements. Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming. Knowledge of other cloud platforms (AWS, Azure) is a plus. Familiarity with Helm charts for Kubernetes deployments. Experience with monitoring tools like Prometheus, Grafana, or Stackdriver. Knowledge of security best practices for cloud and Kubernetes environments. Bachelor's or Master's degree in Computer Science, Engineering, or a related field. You'll achieve more when you join HSBC.
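To make the Apache Beam requirement concrete, here is a minimal batch pipeline sketch in Beam's Python SDK. The bucket paths and CSV layout are placeholders; the same code runs locally on the DirectRunner and targets Dataflow on GCP with different pipeline options.

```python
# Read CSV lines, sum an amount per user, and write the totals out.
import apache_beam as beam

def parse(line: str):
    user_id, amount = line.split(",")
    return user_id, float(amount)

with beam.Pipeline() as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/events.csv")   # placeholder
        | "Parse" >> beam.Map(parse)
        | "SumPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda k, v: f"{k},{v}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/out/totals")   # placeholder
    )
```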

Posted 3 weeks ago

Apply

4.0 - 8.0 years

25 - 30 Lacs

Pune

Work from Office


Some careers shine brighter than others. If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist / Consultant Specialist / Senior Software Engineer / Software Engineer (based on the number of years of experience and the role).

In this role, you will: We are seeking a highly skilled and experienced Senior Data Engineer with expertise in Java (Java 8), microservices, Spring Boot 3.0.0, Postgres, JPA, a React/TypeScript/JavaScript UI, Apache Flink, Apache Beam, MongoDB, and Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Pub/Sub, Google Cloud Storage (GCS), and Composer. The ideal candidate should also have hands-on experience with Apache Airflow, Google Kubernetes Engine (GKE), and Python for scripting and automation. You will play a critical role in designing, developing, and maintaining scalable, high-performance data pipelines and cloud-native solutions, with a strong focus on real-time stream processing using Apache Flink.

Design, develop, and maintain real-time and batch data pipelines using Apache Flink and Apache Beam. Implement stateful stream processing, event-time handling, and windowing with Flink. Optimize Flink jobs for performance, scalability, and fault tolerance. Build scalable, high-performance applications using Java. Write clean, maintainable, and efficient code following best practices. Integrate Flink pipelines with external systems such as Kafka, HDFS, and NoSQL databases. Use Apache Airflow (or Composer on GCP) to orchestrate complex workflows and automate data pipeline execution. Monitor and troubleshoot Airflow DAGs to ensure smooth operations. Leverage GCP services to build and deploy cloud-native solutions: Dataflow for real-time and batch data processing pipelines; BigQuery for data analysis and query optimization on large datasets; Pub/Sub for messaging and event-driven architectures; GCS for managing and optimizing cloud storage for data pipelines; Composer for orchestrating workflows using Apache Airflow on GCP. Deploy and manage containerized applications on Google Kubernetes Engine (GKE). Design Kubernetes manifests and Helm charts for deploying scalable and fault-tolerant applications. Design and manage NoSQL databases using MongoDB, including schema design, indexing, and query optimization (see the sketch below). Ensure data consistency and performance for high-throughput applications. Use Python for scripting, automation, and building utility tools. Write Python scripts to interact with APIs, process data, and manage workflows. Architect distributed systems with a focus on scalability, reliability, and performance. Design fault-tolerant systems with high availability using best practices. Work closely with cross-functional teams, including data engineers, DevOps engineers, and product managers, to deliver end-to-end solutions. Participate in code reviews, design discussions, and technical decision-making.
Monitor production systems using tools like Stackdriver, Prometheus, or Grafana. Optimize resource usage and costs for GCP services and Kubernetes clusters.

Requirements. To be successful in this role, you should meet the following requirements: Strong proficiency in the Java stack mentioned above, with experience in building scalable and high-performance applications. Basic to intermediate knowledge of Python for scripting and automation. Hands-on experience with Apache Flink for real-time stream processing and batch processing. Knowledge of Flink's state management, windowing, and event-time processing. Experience with Flink's integration with GCP services. Knowledge of Apache Beam for unified batch and stream data processing. Proficiency in Apache Airflow for building and managing workflows. Experience with Composer on GCP is a plus. Cloud platform expertise: strong experience with Google Cloud Platform (GCP) services: Dataflow, BigQuery, Pub/Sub, GCS, and Composer. Familiarity with GCP IAM, networking, and cost optimization. Hands-on experience with Docker for containerization. Proficiency in deploying and managing applications on Google Kubernetes Engine (GKE). Expertise in MongoDB, including schema design, indexing, and query optimization. Familiarity with other NoSQL or relational databases is a plus. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Ability to work in an agile environment and adapt to changing requirements. Experience with other stream processing frameworks like Apache Kafka Streams or Spark Streaming. Knowledge of other cloud platforms (AWS, Azure) is a plus. Familiarity with Helm charts for Kubernetes deployments. Experience with monitoring tools like Prometheus, Grafana, or Stackdriver. Knowledge of security best practices for cloud and Kubernetes environments. You'll achieve more when you join HSBC.
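Since this listing repeats the Flink/Beam stack of the previous one, here is a different hedged sketch: the MongoDB schema-design and indexing work it describes, using pymongo. The database, collection, fields, and URI are hypothetical.

```python
# Create a compound index supporting "latest events per user" queries.
from pymongo import ASCENDING, DESCENDING, MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
events = client["pipeline"]["events"]

# Index matches the query pattern: filter by user_id, sort by recency.
events.create_index([("user_id", ASCENDING), ("created_at", DESCENDING)])

events.insert_one(
    {"user_id": "u42", "type": "payment", "created_at": "2024-01-01T00:00:00Z"}
)
latest = events.find_one({"user_id": "u42"}, sort=[("created_at", DESCENDING)])
print(latest)
```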

Posted 3 weeks ago

Apply

5.0 - 9.0 years

18 - 20 Lacs

Pune

Work from Office


Join us as a Senior Big Data/Spark Engineer at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences. To be successful as a Senior Big Data/Spark Engineer you should have experience with:

Advanced Big Data / Apache Spark expertise: Demonstrated experience developing, optimizing, and troubleshooting data processing applications using Apache Spark. Proficiency in writing efficient SQL queries and implementing data transformation pipelines at scale. Must be able to analyze performance bottlenecks and implement optimization strategies (see the sketch below).
Scala programming proficiency: Strong command of Scala with an emphasis on functional programming paradigms. Experience implementing type-safe, immutable, and composable code patterns for data processing applications. Ability to leverage Scala's advanced features for building robust Spark applications.
Cloud / DevOps competency: Hands-on experience with AWS data services including S3, EC2, EMR, Glue, and related technologies. Proficient with modern software engineering practices including version control (Git), CI/CD pipelines, infrastructure as code, and automated testing frameworks.
Problem-solving / analytical skills: Exceptional ability to diagnose complex issues, perform root cause analysis, and implement effective solutions. Experience with performance tuning, data quality validation, and systematic debugging of distributed data applications.

Some other highly valued skills may include:
Quantexa certification: Certified experience with the Quantexa platform and its data analytics capabilities will be highly regarded.
Front-end development experience: Familiarity with Node.js and modern JavaScript frameworks for developing data visualization interfaces or dashboards.
Communication excellence: Exceptional verbal and written communication skills with the ability to translate technical concepts for diverse audiences. Experience collaborating with business stakeholders, product teams, and technical specialists.
Data architecture knowledge: Understanding of data modeling principles, schema design, and data governance practices in distributed environments.
Containerization / orchestration: Experience with Docker, Kubernetes, or similar technologies for deploying and managing data applications.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities: Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete and consistent data. Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures. Develop processing and analysis algorithms fit for the intended data complexity and volumes. Collaborate with data scientists to build and deploy machine learning models.
Analyst expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. Requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others. Or, for an individual contributor, they develop technical expertise in their work area, acting as an advisor where appropriate. Will have an impact on the work of related teams within the area. Partner with other functions and business areas. Takes responsibility for the end results of a team's operational processing and activities. Escalate breaches of policies/procedures appropriately. Take responsibility for embedding new policies/procedures adopted due to risk mitigation. Advise and influence decision making within own area of expertise. Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to. Deliver your work and areas of responsibility in line with relevant rules, regulations and codes of conduct. Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organisation's products, services and processes within the function. Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation's sub-function. Make evaluative judgements based on the analysis of factual information, paying attention to detail. Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents. Guide and persuade team members and communicate complex/sensitive information. Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organisation.
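The performance-bottleneck analysis named above usually starts with the query plan, caching, and shuffle sizing. Here is a hedged PySpark sketch of those three moves (the role's production code would be Scala); paths, columns, and the partition count are placeholders.

```python
# Inspect the plan, cache a dataset reused twice, and size shuffles.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("tuning-demo")
    .config("spark.sql.shuffle.partitions", "200")  # tune to the cluster
    .getOrCreate()
)

df = spark.read.parquet("s3://bucket/transactions/")        # placeholder
df = df.filter(F.col("status") == "SETTLED").cache()        # reused below

daily = df.groupBy("trade_date").agg(F.sum("notional").alias("volume"))
daily.explain()  # check the physical plan for unwanted scans/shuffles
daily.write.mode("overwrite").parquet("s3://bucket/out/daily/")

df.groupBy("desk").count().show()  # second use of the cached dataset
```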

Posted 3 weeks ago

Apply

4.0 - 6.0 years

4 - 8 Lacs

Gurugram

Work from Office


Responsibilities:
- Responsible for mentoring a team of 4-6 junior engineers.
- Participate in all agile rituals, including daily stand-ups, iteration planning, story huddles, retrospectives, and creation of burn-up charts.
- Ensure that all work is captured in Jira.
- Support and develop software developers by providing advice, coaching, and educational opportunities.

Requirements:
- B.Eng/B.Tech in an engineering stream from a good engineering institute.
- Strong proficiency with JavaScript and other related technologies.
- Basic understanding of front-end technologies such as HTML5 and CSS3.
- Hands-on experience with Node.js and frameworks available for it, such as Express.
- Understanding of the nature of asynchronous programming and its quirks and workarounds.
- Good understanding of server-side templating languages.
- Good understanding of server-side CSS pre-processors.
- Knowledge of databases such as MySQL.
- Integration of multiple data sources and databases into one system.
- Understanding of messaging systems like Kafka.
- Understanding of the fundamental design principles behind scalable applications.
- Understanding of the differences between multiple delivery platforms, such as mobile and desktop, and optimizing output accordingly.
- Experience with creating database schemas that represent and support business processes.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office


About the Role: We are seeking a skilled Java Developer with 2+ years of hands-on experience to join our team. The ideal candidate will have expertise in building and maintaining scalable backend systems, with a focus on GPS tracking and IoT solutions. You will contribute to enhancing the platform's performance, integrating new features, and ensuring seamless communication with GPS devices.

Key Responsibilities:
- Develop, maintain, and optimize backend services using Java.
- Work with the Netty framework to handle high-performance network communication (TCP/UDP, HTTP, WebSocket); see the sketch below.
- Design and manage database schemas using MySQL or PostgreSQL.
- Implement and maintain GPS protocols for device communication.
- Use Maven/Gradle for dependency management and build automation.
- Optimize server performance for scalability and low-latency processing of GPS data.
- Troubleshoot and debug issues across the stack, including networking, database, and API layers.
- Participate in code reviews and adhere to best practices for code quality, testing, and documentation.

Technical Requirements:
- 2+ years of experience in Core Java development.
- Proficiency in Netty for asynchronous, event-driven networking.
- Strong understanding of relational databases (MySQL/PostgreSQL).
- Experience with Maven/Gradle and build automation tools.
- Familiarity with RESTful APIs and WebSocket communication.

Soft Skills:
- Strong problem-solving skills for debugging complex distributed systems.
- Ability to work independently and collaboratively in a team environment.
- Excellent communication skills for documenting and explaining technical decisions.
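To illustrate the event-driven TCP handling the role describes, here is a hedged sketch using Python's asyncio rather than the stack's Java/Netty. The line-based "GPS" message format, port, and ACK convention are invented for the example.

```python
# Async TCP server: parse newline-delimited device reports, acknowledge each.
import asyncio

async def handle_device(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    while line := await reader.readline():
        # e.g. b"IMEI123,12.9716,77.5946\n" -> device id, latitude, longitude
        device_id, lat, lon = line.decode().strip().split(",")
        print(f"{device_id}: {lat},{lon}")
        writer.write(b"ACK\n")  # many GPS protocols expect an acknowledgement
        await writer.drain()
    writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle_device, "0.0.0.0", 5023)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```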

Posted 3 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Gurugram

Work from Office


Role Description: As an Informatica PL/SQL Developer, you will be a key contributor to our client's data integration initiatives. You will be responsible for developing ETL processes, performing database performance tuning, and ensuring the quality and reliability of data solutions. Your experience with PostgreSQL, DBT, and cloud technologies will be highly valuable.

Responsibilities:
- Design, develop, and maintain ETL processes using Informatica and PL/SQL.
- Implement ETL processes using DBT with Jinja and automated unit tests.
- Develop and maintain data models and schemas.
- Ensure adherence to best development practices.
- Perform database performance tuning in PostgreSQL (see the sketch below).
- Optimize SQL queries and stored procedures.
- Identify and resolve performance bottlenecks.
- Integrate data from various sources, including Kafka/MQ and cloud platforms (Azure).
- Ensure data consistency and accuracy across integrated systems.
- Work within an agile environment, participating in all agile ceremonies.
- Contribute to sprint planning, daily stand-ups, and retrospectives.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Troubleshoot and resolve data integration and database issues.
- Provide technical support to stakeholders.
- Create and maintain technical documentation for ETL processes and database designs.
- Clearly articulate complex technical issues to stakeholders.

Qualifications and Experience:
- 5 to 8 years of experience as an Informatica PL/SQL Developer or in a similar role.
- Hands-on experience with data models and DB performance tuning in PostgreSQL.
- Experience implementing ETL processes using DBT with Jinja and automated unit tests.
- Strong proficiency in PL/SQL and Informatica.
- Experience with Kafka/MQ and cloud platforms (Azure).
- Familiarity with ETL processes using DataStage is a plus.
- Strong SQL skills.
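A minimal sketch of the PostgreSQL tuning workflow mentioned above: run EXPLAIN ANALYZE to spot a sequential scan, then add the index that serves the filter. The DSN, table, and column are placeholders, shown here with psycopg2.

```python
# Diagnose a slow filter query, then create a supporting index.
import psycopg2

conn = psycopg2.connect("dbname=warehouse user=etl")  # placeholder DSN
with conn, conn.cursor() as cur:
    cur.execute(
        "EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,)
    )
    for (line,) in cur.fetchall():
        print(line)  # look for 'Seq Scan' where an 'Index Scan' is expected
    # If a sequential scan dominates, an index like this usually helps:
    cur.execute(
        "CREATE INDEX IF NOT EXISTS idx_orders_customer ON orders (customer_id)"
    )
conn.close()
```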

Posted 3 weeks ago

Apply

11.0 - 16.0 years

13 - 18 Lacs

Bengaluru

Work from Office


Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward, always serving up new ways to meet life's moments, a future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together when we combine your strengths with ours is unstoppable. Are you ready to join a team that dreams as big as you do? AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics. Do you dream big? We need you.

Job Title: SRE Platform Engineer. Location: Bengaluru. Reporting to: Senior Engineering Manager.

Purpose of the role: We are looking for a Site Reliability Engineer (SRE) Platform Engineer to design, build, and maintain scalable, resilient, and efficient infrastructure for our Data Platform. This role focuses on developing platform solutions, improving system reliability, automating infrastructure, and enhancing developer productivity. You will work closely with software engineers, architects, data engineers, DevOps, and security teams to create a highly available and performant platform.

Key tasks & accountabilities:
- Platform Engineering & Automation: Design and implement scalable, automated, and self-healing infrastructure solutions.
- Infrastructure as Code (IaC): Develop and maintain infrastructure using Terraform.
- Observability & Monitoring: Implement and maintain monitoring, logging, and alerting systems using Datadog and Grafana (see the sketch below).
- Unity Catalog: Implement and maintain Unity Catalog, metadata, Delta Sharing, and identity and access management.
- Databases: Implement and maintain relational databases, data warehouses, and NoSQL databases.
- Power BI: Manage the entire Power BI tenant within ABI.
- CI/CD & DevOps Practices: Optimize CI/CD pipelines using ADO, GitHub Actions, or ArgoCD to enable seamless deployments.
- Cloud: Architect and manage cloud-native platforms using Azure, AWS, or Google Cloud Platform (GCP).
- Networking: Manage and secure the data platform network by enforcing network security policies, integrating on-premises networks with cloud environments, and configuring VNETs, subnets, and routing policies.
- Disaster Recovery: Develop and maintain the disaster recovery environment and conduct periodic disaster recovery drills.
- Resilience & Incident Management: Improve system reliability by implementing fault-tolerant designs and participating in L4-level resolution.
- Security & Compliance: Ensure platform security by implementing best practices for cloud-based data platforms, access controls, and zone-specific compliance requirements.
- Developer Enablement: Build internal tools and frameworks to enhance the developer experience and enable self-service capabilities.

Qualifications, Experience, Skills:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, with a minimum of 3 years of experience.
- Certifications (any one of them): Azure Developer Associate, Azure DevOps Engineer Expert, Azure Solutions Architect Expert, Azure Data Engineer Associate, Google Professional SRE Certification, AWS Certified DevOps Engineer Professional.

Technical expertise:
- Programming languages: Proficient in languages such as Bash, PowerShell, Terraform, Python, Java, etc.
- Cloud platforms: Expertise in Azure, AWS, or GCP cloud services.
- Infrastructure as Code (IaC): Experience with Terraform.
- Unity Catalog: Deep understanding of Databricks architecture, schema and table structure, metadata, Delta Sharing, and identity and access management.
- Databases: Deep understanding of database concepts and experience with relational databases, data warehouses, and NoSQL databases.
- Kubernetes & containers: Hands-on experience with Kubernetes, Helm, and Docker in production environments.
- Power BI: Deep understanding of Power BI administration, workspace management, dashboard development, performance optimization, and integration.
- Monitoring & logging: Experience with observability tools like Datadog and Grafana.
- CI/CD & DevOps: Experience with GitHub, Azure DevOps, GitHub Actions, or ArgoCD.
- Networking & security: Experience with cloud networks, firewalls, VPNs, DNS, policy deployment, and vulnerability remediation.
- Disaster recovery: Deep understanding of cloud DR concepts and high-availability requirements.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.
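As a hedged sketch of the observability plumbing this role owns, here is a Python example that pushes a custom gauge metric to Datadog's v1 series HTTP API. The metric name and tags are invented; DD_API_KEY must be set in the environment.

```python
# Submit one point of a hypothetical pipeline-lag gauge to Datadog.
import os
import time

import requests

payload = {
    "series": [
        {
            "metric": "platform.pipeline.lag_seconds",  # hypothetical metric
            "points": [[int(time.time()), 12.5]],
            "type": "gauge",
            "tags": ["env:prod", "team:data-platform"],
        }
    ]
}
resp = requests.post(
    "https://api.datadoghq.com/api/v1/series",
    headers={"DD-API-KEY": os.environ["DD_API_KEY"]},
    json=payload,
)
resp.raise_for_status()
```

In practice a Datadog agent or client library would handle batching and retries; the raw API call just shows the shape of the data.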

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Job Title: SRE Platform Engineer. Location: Bengaluru. Reporting to: Senior Engineering Manager.

Purpose of the role: We are looking for a Site Reliability Engineer (SRE) Platform Engineer to design, build, and maintain scalable, resilient, and efficient infrastructure for our Data Platform. This role focuses on developing platform solutions, improving system reliability, automating infrastructure, and enhancing developer productivity. You will work closely with software engineers, architects, data engineers, DevOps, and security teams to create a highly available and performant platform.

Key tasks & accountabilities:
- Platform Engineering & Automation: Design and implement scalable, automated, and self-healing infrastructure solutions.
- Infrastructure as Code (IaC): Develop and maintain infrastructure using Terraform.
- Observability & Monitoring: Implement and maintain monitoring, logging, and alerting systems using Datadog and Grafana.
- Unity Catalog: Implement and maintain Unity Catalog, metadata, Delta Sharing, and identity and access management.
- Databases: Implement and maintain relational databases, data warehouses, and NoSQL databases.
- Power BI: Manage the entire Power BI tenant within ABI.
- CI/CD & DevOps Practices: Optimize CI/CD pipelines using ADO, GitHub Actions, or ArgoCD to enable seamless deployments.
- Cloud: Architect and manage cloud-native platforms using Azure, AWS, or Google Cloud Platform (GCP).
- Networking: Manage and secure the data platform network by enforcing network security policies, integrating on-premises networks with cloud environments, and configuring VNETs, subnets, and routing policies.
- Disaster Recovery: Develop and maintain the disaster recovery environment and conduct periodic disaster recovery drills.
- Resilience & Incident Management: Improve system reliability by implementing fault-tolerant designs and participating in L4-level resolution.
- Security & Compliance: Ensure platform security by implementing best practices for cloud-based data platforms, access controls, and zone-specific compliance requirements.
- Developer Enablement: Build internal tools and frameworks to enhance the developer experience and enable self-service capabilities.

Qualifications, Experience, Skills:
- Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field, with a minimum of 3 years of experience.
- Certifications (any one of them): Azure Developer Associate, Azure DevOps Engineer Expert, Azure Solutions Architect Expert, Azure Data Engineer Associate, Google Professional SRE Certification, AWS Certified DevOps Engineer Professional.

Technical expertise:
- Programming languages: Proficient in languages such as Bash, PowerShell, Terraform, Python, Java, etc.
- Cloud platforms: Expertise in Azure, AWS, or GCP cloud services.
- Infrastructure as Code (IaC): Experience with Terraform.
- Unity Catalog: Deep understanding of Databricks architecture, schema and table structure, metadata, Delta Sharing, and identity and access management.
- Databases: Deep understanding of database concepts and experience with relational databases, data warehouses, and NoSQL databases.
- Kubernetes & containers: Hands-on experience with Kubernetes, Helm, and Docker in production environments.
- Power BI: Deep understanding of Power BI administration, workspace management, dashboard development, performance optimization, and integration.
- Monitoring & logging: Experience with observability tools like Datadog and Grafana.
- CI/CD & DevOps: Experience with GitHub, Azure DevOps, GitHub Actions, or ArgoCD.
- Networking & security: Experience with cloud networks, firewalls, VPNs, DNS, policy deployment, and vulnerability remediation.
- Disaster recovery: Deep understanding of cloud DR concepts and high-availability requirements.

And above all of this, an undying love for beer! We dream big to create a future with more cheers.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

1 - 6 Lacs

Hyderabad

Hybrid


Role Summary: This role requires significant functional expertise in SAP payroll delivery, primarily for the Asia Pacific region (India), with practical knowledge of Template processes and standards; technical SAP ABAP expertise will be an added advantage. The role is expected to function independently based on information gathered from multiple sources, including internal and external business partners as well as the Country Solutions & Localization team leadership. It manages all aspects of product solution delivery and related communications, and is responsible for providing regular status reports regarding key issues, following defined processes and guidelines, to internal business partners and team leadership.

Required Knowledge, Skills & Abilities:
- A dynamic, results-oriented professional with 3 to 10+ years of experience in SAP HCM and SAP Payroll design and development, delivering high-quality results.
- Strong hands-on experience in SAP ERP, ABAP, SAP HR, HCM, Agile, and SAP HANA.
- Excellent communication skills, both verbal and written.
- Strong analytical and logical thinking ability.
- Strong hold on country payroll, with expertise in customizing country PY solutions for global clients.
- Ability to design and configure PY solutions in a multi-tenant environment with 100% reusability and 100% maintenance-free solutions.
- Configuration: strong hold on PY schemas, PCRs, generic PY functions, and reusable custom PY functions (global in nature).
- Configuration: strong hold on all PY processes (mostly global in nature and widely practiced in different industries).
- Configuration: strong hold on all SAP-specific WTs, monthly PY splits, and specific clusters and internal tables.
- Customization: hands-on in preparing functional specifications for any PY-specific report, program, or interface.
- Good knowledge of Employee/Manager Self Services, workflows, SAP/Portal, connectivity, and landscape.
- Knowledge of ABAP programming with debugging skills.

A little about us: We're partners in transformation. We help organizations activate ideas and solutions to take advantage of a new world of opportunity. We are a committed team working with over 6000 product-based companies across North America, Europe, and Asia-Pacific. As an industry leader in IT talent management, we work with progressive leaders to drive change. In India, TEKsystems currently has 3000+ technical consultants employed at various Fortune 500 companies across the country. Have a look at our Glassdoor review of TEKsystems: https://www.glassdoor.co.in/Reviews/TEKsystems-Chennai-Reviews-EI_IE23297.0,10_IL.11,18_IM1067.htm
A US-based $4.2 billion renowned brand in IT talent management, associated with over 6000 companies around the globe, and the world's largest technology talent management company, serving industries like IT, Telecom, Infrastructure, and Engineering. TEKsystems is a part of Allegis Group, a $12.3 billion US-based privately held firm and one of the world's largest privately held companies (source: https://www.forbes.com/companies/allegis-group/). The 6th largest IT talent management company in the world and the 2nd largest in the US, closely associated with 90% of Fortune 500 companies. Every year we deploy over 80,000 employees across different parts of the world, with operations in North America, Europe, and Asia-Pacific and over 300 offices across locations. For the fourth consecutive year (2014-2017), TEKsystems was named to Fortune magazine's "100 Best Companies to Work For" list: https://fortune.com/best-companies/2017/teksystems/ Please visit www.teksystems.com and www.allegisgroup.com for more information. Happy to answer any of your queries; please feel free to reach out to me for more information on 8639264915.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Kochi

Work from Office


Subject matter leader in a country/region's payroll compliance, legislation, and local regulations, acting as the key reference point for teams internally on standard payroll service, business process, Strada Pay technology, and compliance matters. This will include proactively scanning upcoming changes to legislation and tax laws in their country through the Compliance Alerts team, the Global Payroll Association, and other sources. Working closely with the Product development team to guide the development roadmap, including collaborating to keep country workbooks updated with new country requirements and changes. Review and analyze current payroll procedures in order to recommend changes leading to best-practice solutions and avoid manual workarounds to the extent possible. Evolve Strada Pay's competitive market offering in terms of functionality, cost effectiveness, and user experience. Responsible for driving continuous improvement to the Strada Pay product in the country, the delivery methodology, and templates/standards. Give subject matter expertise and advice to Strada Pay project teams working on deployments of the payroll service to each account, ensuring effective transition into operation. This may include supporting implementation teams with training and knowledge about Strada Payroll standards and functionality, and working with implementation colleagues to align clients to Strada Payroll standards. Support solutioning, acting as gatekeeper for client customization requests. You will also act as the go-to person for areas of subject matter expertise, including pre-sales support and queries, able to showcase Payroll capabilities to clients and discuss dos and don'ts, successes and pitfalls, and lessons learned as best practices and standards are refined and updated. Actively contribute to the Country Champion network across countries/regions.

General requirements:
- Expert knowledge of processes, policies, and regulations within the area of Australian payroll.
- A 3-year Degree/Diploma.
- 5-8 years of relevant experience in Australian payroll and HR outsourcing in a corporate environment.
- Strong knowledge of MS Office tools such as Excel, Word, and PowerPoint.
- Flexibility to support a global and fast-paced environment.
- Attention to detail.
- Excellent written and verbal skills.
- Self-motivated with a willingness to learn.
- Ability to lead and manage a team.
- Ability to collaborate and work in a team environment, as well as independently, while adhering to processes and procedures.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

20 - 25 Lacs

Noida

Work from Office


Inspect and comprehend current ETL processes written in Talend. Re-engineer and implement ETL processes in Python based on best practices for performance and ease of maintenance. Work together with data engineers and DBAs to synchronize Python ETL jobs with database schemas and mappings. Create strong, scalable, and reusable code for data extraction, transformation, and loading (see the sketch below). Optimize data processing and ensure data quality and integrity across systems. Document new ETL processes and assist during test and production rollout.

Required Skills:
- Proficient in Python programming for data processing, automation, and application development.
- Strong knowledge of Pandas, NumPy, and other Python libraries for data manipulation and analysis.
- Solid understanding of data workflows, including data cleaning, transformation, and integration.
- Proficient in SQL, with solid knowledge of relational databases (e.g., PostgreSQL, MySQL, SQL Server).
- Ability to read, analyze, and refactor existing code and workflows for maintainability and performance.

Preferred Skills:
- Experience with Apache Spark or PySpark.
- Familiarity with Python REST frameworks (e.g., Flask, Django, FastAPI) for building APIs and web services.
- Exposure to data lakes or cloud-based data platforms (AWS, Azure, GCP).
- Knowledge of data warehousing and performance tuning.

Education: B.E./B.Tech/MCA.
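A minimal sketch of the kind of Talend-to-Python re-engineering described above, using Pandas and SQLAlchemy. The source file, columns, target table, and Postgres DSN are assumptions, not the client's actual pipeline.

```python
# Extract -> transform -> load: a single Talend job rewritten in Python.
import pandas as pd
from sqlalchemy import create_engine

# Extract
raw = pd.read_csv("customers_raw.csv")  # placeholder source

# Transform: clean, standardize, dedupe, derive
raw["email"] = raw["email"].str.strip().str.lower()
raw = raw.dropna(subset=["customer_id"]).drop_duplicates("customer_id")
raw["signup_date"] = pd.to_datetime(raw["signup_date"], errors="coerce")

# Load
engine = create_engine("postgresql://etl@localhost/warehouse")  # placeholder
raw.to_sql("dim_customer", engine, if_exists="replace", index=False)
```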

Posted 3 weeks ago

Apply

4.0 - 9.0 years

22 - 25 Lacs

Bengaluru

Work from Office


We look primarily for people who are passionate about solving business problems through innovation and engineering practices. You will be required to apply your depth of knowledge and expertise to all aspects of the software development lifecycle, as well as partner continuously with your many stakeholders daily to stay focused on common goals. We embrace a culture of experimentation and constantly strive for improvement and learning. We welcome diverse perspectives and people who are not afraid to challenge assumptions. Our mission is to accelerate the pace of financial innovation and build new financial products for American Express. Our platform streamlines the process of launching and iterating financial products.

Responsibilities:
- Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality.
- Functions as a core member of an Agile team, driving user story analysis and elaboration, design and development of software applications, testing, and building automation tools.
- Designs, codes, tests, maintains, and documents data applications.
- Takes part in reviews of own work and reviews of colleagues' work.
- Defines test conditions based on the requirements and specifications provided.
- Partners with product teams to understand business data requirements and identify data needs and data sources to create the data architecture.
- Documents data requirements/data stories and maintains data models to ensure flawless integration into existing data architectures.
- Leads multiple tasks effectively, progressing work in parallel.
- Adapts to change quickly and easily; handles problems and acts on own initiative without being prompted.

Must have demonstrated proficiency and experience in the following tools and technologies:
- Python object-oriented programming.
- Python built-in libraries: json, base64, logging, os, etc.
- Python: Poetry and dependency management.
- Asynchronous, reactive microservices using FastAPI (see the sketch below).
- A firm foundational understanding of distributed storage and distributed compute.
- PySpark framework: DataFrames (aggregation, windowing techniques), Spark SQL.
- Cornerstone data ingestion process, Cornerstone business metadata management, interactive analytics using YellowBrick, Hyperdrive JSON schema development, CStreams real-time event ingestion pipeline using Kafka, Event Engine management.
- Test-driven development.
- Must have banking domain knowledge: money movement, Zelle, ACH, intraday.

Working knowledge of the following tools and technologies:
- Data governance toolset: Collibra, Manta.
- REST API specifications using Swagger.
- Development tools: Central, XLR, Jenkins.
- Docker image creation, containers, PODs.
- Hydra cloud deployment and troubleshooting.
- Logging using the Amex Enterprise Logging Framework.

Analytical and problem-solving skills. Technical fluency: the ability to clearly describe tradeoffs to technical and non-technical audiences alike to help support product decisions. Highly organized with strong prioritization skills and outstanding written and verbal communication; you are great at research and documenting your learnings. A bachelor's degree.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite, or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities
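Since the role names asynchronous microservices with FastAPI, here is a minimal hedged sketch; the route, model, and behavior are hypothetical, not an Amex API.

```python
# A tiny async endpoint with request validation via Pydantic.
# Run with: uvicorn app:app --reload
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Transfer(BaseModel):
    from_account: str
    to_account: str
    amount: float

@app.post("/transfers")
async def create_transfer(t: Transfer) -> dict:
    # In a real service this would publish to Kafka or call downstream APIs.
    return {"status": "ACCEPTED", "amount": t.amount}
```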

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Pune

Work from Office


Design, develop, and maintain high-performance data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, and Composer (Airflow). Write SQL queries, dbt models, or Dataflow pipelines to transform raw data into analytics-ready datasets (see the sketch below). Develop and optimize SQL queries and data transformation scripts for data warehousing and reporting purposes. Lead proof-of-concepts (POCs) and best-practice implementations for modern data architecture, including data lakes and cloud-native data warehouses. Ensure data security, lineage, quality, and compliance across GCP data ecosystems through IAM, audit logging, data encryption, and schema management. Apply security best practices, including IAM, secrets management, and vulnerability scanning. Monitor, troubleshoot, and optimize pipeline and warehouse performance using GCP-native tools such as Cloud Monitoring, Cloud Logging, and BigQuery Optimizer. Automate infrastructure provisioning, configuration, and deployment using tools like Terraform, Ansible, or Cloud Deployment Manager, and build and manage CI/CD pipelines. Design, implement, and manage cloud infrastructure on Google Cloud Platform (GCP). Write clean, maintainable, and efficient code following best practices.

Requirements:
- 8-12 years of experience in data engineering, with at least 3-5 years of hands-on experience specifically in Google Cloud Platform (GCP).
- BigQuery (data modelling, optimization, security); advanced SQL proficiency with complex data transformations, windowing functions, and analytical querying.
- Ability to design and develop modular, maintainable SQL models using dbt best practices.
- Experience developing high-performing batch pipelines.
- Strong understanding of data architecture patterns: data lakes, cloud-native data warehouses, event-driven architectures.
- Experience with version control systems like Git and branching strategies.
- DevOps: strong scripting skills (Bash, Python, etc.); proficiency in building and managing CI/CD pipelines.
- Exposure to monitoring and optimizing cloud resources for performance, cost, and scalability using tools like Stackdriver, Prometheus, or Grafana.
- Exposure to deployment tools like Terraform, Ansible, or Cloud Deployment Manager.
- Monitor system health, identify potential bottlenecks, and ensure high availability and disaster recovery processes are in place.
- General: experience with Agile delivery methodologies (e.g., Scrum, Kanban); a demonstrable track record of dealing well with ambiguity, prioritizing needs, and delivering results in a dynamic environment.
- Conduct regular workshops, demos, and stakeholder reviews to showcase data solutions and capture feedback.
- Excellent communication and collaboration skills; collaborate with development teams to streamline the software delivery process and improve system reliability.
- Mentor and upskill junior engineers and analysts on GCP tools; participate in continuous improvement and the transformation towards Agile, DevOps, and CI/CD as drivers of improved productivity.
- Ability to translate business objectives into data solutions with a focus on delivering measurable business value.
- Flexible to work in shifts and provide on-call support, owning the smooth operation of applications and systems in a production environment.
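A hedged sketch of the BigQuery transformation work described above, combining an aggregation with a windowing function via the google-cloud-bigquery Python client. The project, dataset, table, and columns are placeholders, and the client assumes application-default credentials.

```python
# Daily event counts per user plus a running total (window function).
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
SELECT
  user_id,
  event_date,
  daily_events,
  SUM(daily_events) OVER (
    PARTITION BY user_id ORDER BY event_date
  ) AS running_total
FROM (
  SELECT user_id, DATE(event_ts) AS event_date, COUNT(*) AS daily_events
  FROM `my_project.raw.events`   -- placeholder table
  GROUP BY user_id, event_date
)
"""
for row in client.query(sql).result():
    print(row.user_id, row.event_date, row.running_total)
```

In a dbt project, the inner query and the windowed outer query would typically live as two chained models rather than one nested statement.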

Posted 3 weeks ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Coimbatore

Work from Office


The following projects will be impacted and require experienced technicians: HPCL 3.3K VAVE, GEP Evolve, Pulsar Phase 1, and AdBlue electronics reliability testing. Responsibilities: creating vendor permissions; maintaining a safe work environment; maintaining 5S; electronics reliability testing; handling electronic and electrical components; assembling and dismantling electronics components; skills to use test equipment, test chambers, gauges, and motors, including environment chamber testing; EMI/EMC testing (ESD, surge, EFT, RI, CRF, and voltage variation testing); shipment testing (vibration testing, drop tests, etc.); exposure to electronics products; computer skills such as Excel, Word, and PowerPoint; test report creation and documentation skills; SOP creation.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Bengaluru

Work from Office


What you will be doing:
- Own backend architecture, design, and implementation.
- Design and evolve the database schema to comply with business requirements.
- Work with teams consuming backend APIs to adopt new changes and improvements.
- Improve the performance, reliability, and security of the infrastructure.
- Anticipate future needs while architecting systems.
- Collaborate with other Loop employees, including frontend engineers and product managers.
- Collaborate with technical leadership to prioritize long-term technical investments and get them on the engineering roadmap.
- Educate engineers about new findings and technology best practices.
- Mentor, lead, and hire other engineers.

What we are looking for:
- Bachelor's/Master's degree in Computer Science Engineering or a related field of study.
- 6+ years of relevant industry experience.
- Expertise in an application programming language like Python, Java, or Go.
- Demonstrated ability to build and scale backend systems.
- Strong experience building microservices and scalable, secure APIs.
- Strong experience with CSPs like AWS and GCP.
- Strong experience with at least one NoSQL (Firestore, MongoDB, etc.) or SQL (MySQL/Postgres) database, plus schema design.
- Knowledge of modern auth mechanisms, such as JSON Web Tokens and OAuth (see the sketch below).
- Experience with continuous integration and continuous delivery.
- Experience writing high-quality design docs with architecture diagrams.
- Excellent interpersonal skills.
- Ability to do cost analysis and optimization of software systems.
- Ability to own and lead work on complex technical problems.
- Excitement about our mission and a strong desire to solve problems in the healthcare domain.
- Ability to mentor, lead, and interview other engineers.
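To make the JSON Web Token requirement concrete, here is a minimal hedged sketch with the PyJWT library; the secret and claims are placeholders, and production code would load the key from a secret manager.

```python
# Issue and verify a short-lived HS256 JWT.
import time

import jwt  # PyJWT

SECRET = "change-me"  # placeholder; never hard-code in real services

token = jwt.encode(
    {"sub": "user-123", "role": "admin", "exp": int(time.time()) + 3600},
    SECRET,
    algorithm="HS256",
)
# decode() verifies the signature and the exp claim, raising on failure.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"], claims["role"])
```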

Posted 3 weeks ago

Apply

5.0 - 10.0 years

1 - 2 Lacs

Kolkata

Work from Office


Fusion CX seeks skilled Senior SEO Executives to join our dynamic marketing team. This Senior SEO Executive role demands an analytical mind with a deep understanding of SEO practices to drive our organic search strategies. The ideal candidate will have extensive experience in SEO, a strong analytical background, and a portfolio demonstrating successful SEO campaigns.

Responsibilities of the Senior SEO Executive:
- Keyword research: identify and analyze high-performing keywords to guide content creation and optimize website performance.
- On-page optimization: implement SEO strategies to enhance website visibility and user experience.
- Off-page optimization: develop and execute effective link-building strategies to improve site authority.
- Performance analysis: monitor and analyze SEO performance using tools like Google Analytics, providing insights and recommendations.
- Content strategy: collaborate with the content team to develop SEO-friendly content and ensure alignment with SEO best practices.
- Technical SEO: conduct technical SEO audits, implement schemas, and deliver solutions that improve site health and crawlability (see the sketch below).
- Competitor analysis: analyze competitor strategies to identify gaps and opportunities for improvement.
- Reporting: generate regular reports on SEO performance, highlighting successes, trends, and areas for improvement.
- Leadership: develop and lead comprehensive SEO strategies aligned with overall business objectives.

Skills and qualities to thrive as a Senior SEO Executive in Kolkata:
- Experience: 5+ years of professional experience in SEO with a proven track record of successful SEO campaigns.
- Technical skills: proficiency with SEO tools such as Google Analytics, Google Search Console, SEMrush, and Ahrefs.
- Analytical skills: strong analytical abilities to interpret data and make data-driven decisions.
- Communication skills: excellent verbal and written English communication skills for effective collaboration and reporting.
- Attention to detail: meticulous attention to detail in implementing and monitoring SEO strategies.
- Time management: strong organizational skills to manage multiple projects and meet deadlines.

Preferred qualifications:
- Advanced SEO techniques: experience with schema markup, site speed optimization, and mobile-first indexing.
- Content marketing: background in content marketing and experience with content management systems (CMS).
- HTML/CSS knowledge: basic understanding of HTML and CSS for on-page SEO implementation.
- Industry knowledge: up to date with the latest trends and best practices in SEO and search engine algorithms.

Why join Fusion CX? At Fusion CX, we empower SEO professionals with the tools, support, and growth opportunities to excel in a competitive digital landscape. Here's what we offer: career growth working with a global CX leader to advance your SEO expertise; an innovative, data-driven marketing culture that values creativity and performance; competitive compensation with performance-based incentives; professional development through training, mentorship, and upskilling opportunities; and a collaborative team of talented SEO and marketing professionals who inspire and support each other. If you are passionate about SEO and ready to make a real impact, apply today!
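As a hedged sketch of the schema markup work this role mentions, here is a Python snippet that generates a schema.org Product JSON-LD block for embedding in a page; all product values are invented examples.

```python
# Build a schema.org Product snippet as JSON-LD for on-page markup.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Wireless Headphones",  # example values throughout
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.4",
        "reviewCount": "89",
    },
    "offers": {"@type": "Offer", "priceCurrency": "INR", "price": "2999"},
}
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(product_schema)
    + "</script>"
)
print(snippet)  # paste into the page <head> or render via the CMS template
```

Structured data like this is what enables rich results (stars, price) in search listings, which is why it sits alongside site-speed and mobile-first work in the preferred qualifications.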

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


We are looking for Pension Systems Configuration Analysts to join our team to be a part of the continual evolution of the Compendia and Administrator systems and the rollout of our software to our client base. The team is responsible for the implementation of our Compendia and Administrator application for new clients, both internal and external. It is an exciting opportunity to be a part of the team responsible for the core configuration of the system. This is a great role if you already have experience in configuring and/or testing software platforms and are looking for the next step in your career. Core Duties/Responsibilities The Pensions Systems Configuration Analyst is responsible for software configuration delivery to internal and external clients. Key aspects of the role are to: Undertake analysis of pensions requirements from an outline of user requirements or from a more detailed feasibility study. Agree the scope of work when undertaking a pensions analysis task and provide accurate estimates to the relevant Project Manager for the work to be included. Configure, test, implement and maintain specific pensions elements across the whole suite of the Pension Systems. Configuration elements include but are not limited to: Letter and document outputs Report outputs Web Self Service functionality for employer and scheme member users Pensioner payroll parameters Interface data ingestions and output Workflow processing Continuously develop a professional, technical, and commercially aware approach to delivery of tasks. Undertake analysis on new pensions projects and/or enhancements to existing projects. Analyse changes and enhancements to clients' pensions schemes by taking a view of their impact on the software as installed for the client. Skills, Knowledge & Experience A basic understanding of UK pensions arrangements would be desirable but not essential as training will be provided. Technical/software development aptitude and ability. Experience of configuration and/or testing of large-scale financial or HR software platforms and systems is desirable, or experience in a related area within platform development, configuration or testing. Ability to understand basic data schema models Knowledge of PL/SQL scripting in a SQL or Oracle environment. Qualifications in either a financial services or IT environment Experience in the Pensions Industry, either as a Scheme Administrator or as a Business Analyst experienced in capturing requirements for Pensions IT Projects. Demonstrable software development capability. Demonstrable commercial awareness and operational efficiencies/income generation in current role. Essential Qualities Be approachable and responsive to colleagues and users and have an open-minded and constructive approach to problem solving. Demonstrate effective and probing appraisal of situations. Consult with relevant parties in resolving issues. Be self-motivated, demonstrating tenacity and objectivity in problem solving to get the job done effectively.
Performance Measures Timely completion of own work / contribution to team workload Meeting targets for accuracy, quality, volume and agreed service levels Satisfactory resolution of queries Adherence to procedures and regulations Contribution to continuous improvement Ongoing development of own knowledge and skills Demonstrated willingness to contribute to the team beyond own immediate tasks Quality of support given to colleagues Development of technical knowledge and skill Contribution to target achievement and team goals Successful delivery of tasks

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Bengaluru

Work from Office


We are looking for Pension Systems Configuration Analysts to join our team to be a part of the continual evolution of the Compendia and Administrator systems and the rollout of our software to our client base. The team is responsible for the implementation of our Compendia and Administrator application for new clients, both internal and external. It is an exciting opportunity to be a part of the team responsible for the core configuration of the system. This is a great role if you already have experience in configuring and/or testing software platforms and are looking for the next step in your career. Core Duties/Responsibilities The Pensions Systems Configuration Analyst is responsible for software configuration delivery to internal and external clients. Key aspects of the role are to: Undertake analysis of pensions requirements from an outline of user requirements or from a more detailed feasibility study. Agree the scope of work when undertaking a pensions analysis task and provide accurate estimates to the relevant Project Manager for the work to be included. Configure, test, implement and maintain specific pensions elements across the whole suite of the Pension Systems. Configuration elements include but are not limited to: Letter and document outputs Report outputs Web Self Service functionality for employer and scheme member users Pensioner payroll parameters Interface data ingestions and output Workflow processing Continuously develop a professional, technical, and commercially aware approach to delivery of tasks. Undertake analysis on new pensions projects and/or enhancements to existing projects. Analyse changes and enhancements to clients' pensions schemes by taking a view of their impact on the software as installed for the client. Skills, Knowledge & Experience A basic understanding of UK pensions arrangements would be desirable but not essential as training will be provided. Technical/software development aptitude and ability. Experience of configuration and/or testing of large-scale financial or HR software platforms and systems is desirable, or experience in a related area within platform development, configuration or testing. Ability to understand basic data schema models Knowledge of PL/SQL scripting in a SQL or Oracle environment. Qualifications in either a financial services or IT environment Experience in the Pensions Industry, either as a Scheme Administrator or as a Business Analyst experienced in capturing requirements for Pensions IT Projects. Demonstrable software development capability. Demonstrable commercial awareness and operational efficiencies/income generation in current role. Essential Qualities Be approachable and responsive to colleagues and users and have an open-minded and constructive approach to problem solving. Demonstrate effective and probing appraisal of situations. Consult with relevant parties in resolving issues. Be self-motivated, demonstrating tenacity and objectivity in problem solving to get the job done effectively.
Performance Measures Timely completion of own work / contribution to team workload Meeting targets for accuracy, quality, volume and agreed service levels Satisfactory resolution of queries Adherence to procedures and regulations Contribution to continuous improvement Ongoing development of own knowledge and skills Demonstrated willingness to contribute to the team beyond own immediate tasks Quality of support given to colleagues Development of technical knowledge and skill Contribution to target achievement and team goals Successful delivery of tasks

Posted 3 weeks ago

Apply

8.0 - 10.0 years

11 - 16 Lacs

Noida

Work from Office


Experience in ServiceNow development and the ability to drive solutions independently. Has worked extensively on ITSM and has a good understanding of CMDB. Awareness of other ServiceNow products, with practical work experience in them an added advantage. Rich experience with ServiceNow client- and server-side JavaScript and the ServiceNow APIs Experience with extending the ServiceNow schema to custom applications, working with ServiceNow platform capabilities and implementing Scoped Applications Experience in managing flows and workflows of medium to complex nature. Understanding of scripted web services and related technologies such as AJAX, Business Rules, JavaScript, SOAP and REST SSO/SAML setup and integration of ServiceNow with other applications Understanding of Service Portal design would be an added advantage. Candidate must have general development experience. System integration experience using web services and other web-based technologies such as XML, HTML, AJAX, CSS, HTTP, REST/SOAP Ability to take on the role of Solution Architect and deliver implementation and enhancement projects for customers along with Project Managers. Proficient in JavaScript with an understanding of ServiceNow scripting. Must have some experience working with relational databases. Candidate must be able to provide management support. Help maintain expert knowledge of the ServiceNow platform and products and ensure mentorship within the team. Support junior members and review the performance of junior developers. Communication with internal and external stakeholders for gathering business needs. Required Certifications and Knowledge: ServiceNow - Certified System Administrator ServiceNow - ITSM preferred, or any other Implementation Specialist Experience working in an Agile team and Scrum framework. Preferred Certifications and Knowledge Certified Application Developer ITIL Certification Micro-certifications: Automated Test Framework, Flow Designer, Integration Hub, Agile & Test Total Experience Expected: 08-10 years B.E./B.Tech

Posted 3 weeks ago

Apply

1.0 - 5.0 years

6 - 10 Lacs

Surat

Work from Office


Understand client requirements and translate them into actionable tasks. Take detailed notes and create project-related documents, diagrams, and requirement architecture. Oversee and coordinate development tasks within the team. Conduct and participate in client meetings and requirement discussions, and provide updates to the client and product owner. Work in late shifts as per project demands. Handle support queries and ensure timely resolution. Act as a bridge between technical teams and clients, ensuring smooth communication. Manage project timelines, risks, and dependencies effectively. Assist in the preparation of project reports and documentation. Support the team in handling eCommerce-related development projects. Requirements Experience: Minimum 1.5 years in a similar role. Educational Background: Bachelor's degree in Computer Science, IT, or a related field. Technical Knowledge: Must have experience in software development and database management. Project Management Skills: Ability to document requirements, create diagrams, and manage client expectations. Communication Skills: Excellent written and spoken English (minimum 4/5 rating). Industry Knowledge: Prior experience or knowledge in the eCommerce domain is preferred. Flexibility: Willingness to work in late shifts and handle multiple responsibilities, including PM, BA, and technical support tasks.

Posted 3 weeks ago

Apply

Exploring Schema Jobs in India

Schema jobs in India are in high demand as organizations across various industries are leveraging data to make informed decisions. A schema job involves designing and implementing data schemas to organize and structure data efficiently. With the increasing importance of data-driven decision-making, professionals with schema skills are highly sought after in the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Chennai

These cities have a thriving tech industry and are actively hiring professionals with schema expertise.

Average Salary Range

The average salary range for schema professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

In the schema field, a typical career progression may include roles such as:

  1. Junior Developer
  2. Data Analyst
  3. Database Administrator
  4. Data Engineer
  5. Tech Lead

With experience and expertise, professionals can advance to higher roles with increased responsibilities and leadership opportunities.

Related Skills

Alongside schema expertise, professionals in this field are often expected to have knowledge of:

  • SQL
  • Data modeling
  • Database management systems
  • ETL processes
  • Data warehousing

These skills complement schema knowledge and enhance job performance.

Interview Questions

  • What is a schema in the context of databases? (basic)
  • Explain the difference between a logical schema and a physical schema. (medium)
  • How do you optimize a database schema for better performance? (advanced)
  • What are the advantages of using a star schema in data warehousing? (medium, illustrated in the first sketch after this list)
  • Can you explain the concept of normalization in database design? (medium, illustrated in the second sketch after this list)
  • What is denormalization, and when would you use it? (advanced, also covered in the second sketch)
  • How do you handle schema changes in a production environment? (advanced, touched on in the third sketch after this list)
  • What are the different types of database constraints you are familiar with? (medium)
  • Describe your experience with designing schemas for big data applications. (advanced)
  • How do you ensure data integrity in a database schema? (medium)
  • Explain the concept of indexing in database schemas. (medium, see the third sketch after this list)
  • What are some best practices for schema design in a distributed database environment? (advanced)
  • How do you approach data migration when changing a schema structure? (advanced)
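
To make the star-schema question above concrete, here is a minimal sketch in standard SQL. It is illustrative only: the table and column names (fact_sales, dim_date, dim_product, and so on) are hypothetical rather than taken from any posting above, and exact type syntax varies slightly between database engines.

    -- A minimal star schema: one central fact table joined to
    -- denormalized dimension tables through surrogate keys.
    CREATE TABLE dim_date (
        date_key    INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240131
        full_date   DATE NOT NULL,
        month_name  VARCHAR(10) NOT NULL,
        year_number INTEGER NOT NULL
    );

    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name VARCHAR(100) NOT NULL,
        category     VARCHAR(50) NOT NULL  -- kept flat for simple joins
    );

    CREATE TABLE fact_sales (
        date_key    INTEGER NOT NULL REFERENCES dim_date (date_key),
        product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
        units_sold  INTEGER NOT NULL,
        revenue     DECIMAL(12, 2) NOT NULL
    );

    -- The payoff: analytical queries need only one join per dimension.
    SELECT d.year_number, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year_number, p.category;

The trade-off being tested in the interview question is that dimensions are deliberately denormalized so reporting queries stay simple and fast, at the cost of some redundancy.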
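
For the normalization and denormalization questions, the following sketch shows both sides with the same hypothetical orders data; again, all names are assumptions for illustration.

    -- Normalized (third normal form): each customer fact is stored once,
    -- so a change of city is a single-row update.
    CREATE TABLE customers (
        customer_id   INTEGER PRIMARY KEY,
        customer_name VARCHAR(100) NOT NULL,
        customer_city VARCHAR(50) NOT NULL
    );

    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers (customer_id),
        amount      DECIMAL(10, 2) NOT NULL
    );

    -- Denormalized reporting copy: customer details are repeated on every
    -- row, trading storage and update cost for join-free reads.
    CREATE TABLE orders_reporting AS
    SELECT o.order_id, c.customer_name, c.customer_city, o.amount
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id;

A reasonable interview answer is that normalization protects write integrity, while selective denormalization (as in the reporting table) is justified when read performance matters more than update cost.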
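
The indexing and production schema-change questions can be grounded the same way. This sketch continues the hypothetical orders table above; note that locking behaviour for ALTER TABLE differs by engine and version, so treat it as a pattern rather than a guarantee.

    -- Indexing: speeds up lookups and joins on customer_id, at the cost
    -- of extra storage and slightly slower writes.
    CREATE INDEX idx_orders_customer ON orders (customer_id);

    -- A low-risk production change is additive: add a nullable column so
    -- existing readers and writers keep working, backfill it, and only
    -- tighten constraints in a later step.
    ALTER TABLE orders ADD COLUMN order_status VARCHAR(20);
    UPDATE orders SET order_status = 'unknown' WHERE order_status IS NULL;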

Closing Remark

As you explore opportunities in the schema job market in India, remember to showcase your expertise and experience confidently during interviews. By preparing thoroughly and staying updated on industry trends, you can position yourself as a valuable asset to organizations seeking schema professionals. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies