
15339 GCP Jobs - Page 39

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site


Job Description: Data Engineer

Role Overview
The Data Engineer will be responsible for ensuring the availability, quality, and transformation of claims and operational data required for model development and integration. The role demands strong data pipeline design and engineering capabilities to support a scalable forecasting and capacity planning framework.

Key Responsibilities
- Gather and process data from multiple sources, including claims systems and operational databases.
- Build and maintain data pipelines to support segmentation and forecasting models.
- Ensure data integrity, transformation, and enrichment to align with modeling requirements.
- Collaborate with the Data Scientist to provide model-ready datasets.
- Support data versioning, storage, and automation for periodic refreshes.
- Assist in the deployment/integration of data flows into operational dashboards or planning tools.

Skills & Experience
- 5+ years of experience in data engineering or ETL development.
- Proficiency in SQL, Python, and data pipeline tools (e.g., Airflow, dbt, Spark).
- Experience with cloud-based data platforms (e.g., Azure, AWS, GCP).
- Understanding of data architecture and governance best practices.
- Prior experience working with insurance or operations-related data is a plus.
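As an illustration of the pipeline work this role describes, below is a minimal Airflow sketch of a daily claims-data refresh. It assumes Airflow 2.4+ and uses hypothetical task logic, file paths, and DAG names that are not taken from the posting; a real pipeline would plug in the actual claims sources and the dbt/Spark transformations mentioned above.

```python
# Minimal Airflow 2.4+ sketch of a daily claims-data refresh (hypothetical names).
from datetime import datetime, timedelta

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_claims(**context):
    # Placeholder extract: in practice this would query the claims system / operational DB.
    df = pd.DataFrame({"claim_id": [1, 2], "amount": [1200.0, 540.0]})
    df.to_parquet("/tmp/claims_raw.parquet")


def transform_claims(**context):
    # Placeholder transform: clean and enrich into a model-ready dataset.
    df = pd.read_parquet("/tmp/claims_raw.parquet")
    df["amount_bucket"] = pd.cut(
        df["amount"], bins=[0, 500, 1000, float("inf")], labels=["low", "mid", "high"]
    )
    df.to_parquet("/tmp/claims_model_ready.parquet")


with DAG(
    dag_id="claims_forecasting_refresh",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract_claims", python_callable=extract_claims)
    transform = PythonOperator(task_id="transform_claims", python_callable=transform_claims)
    extract >> transform
```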

Posted 2 days ago

Apply

14.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Role Description

Job Summary
We are seeking an experienced Software Architect with 14+ years of experience, including at least 4–6 years in software architecture and design. The ideal candidate will have a strong background in Java, JEE, Spring Boot, JPA/Hibernate, and RESTful services, with 7+ years of hands-on experience in the Healthcare domain. This role requires exceptional technical leadership, design thinking, and the ability to work across teams globally.

Key Responsibilities
- Lead architectural design and development for large-scale enterprise solutions in the Healthcare domain.
- Drive the technical roadmap, product strategy, and technology innovation.
- Collaborate with stakeholders including business leads, project managers, and global development teams (onsite and offshore).
- Design and build reusable frameworks and components.
- Evaluate and recommend new tools, frameworks, and technologies.
- Provide architectural guidance to multiple teams and ensure alignment with business and technical goals.
- Conduct code and design reviews; enforce coding best practices.
- Mentor and train team members; promote a culture of continuous learning and improvement.
- Engage with account teams to conceptualize new solutions and create technology roadmaps.
- Solve complex technical challenges and provide guidance on performance tuning, scalability, and reliability.
- Prepare and maintain architecture documentation and design standards.

Required Skills And Experience
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline (Master's degree preferred).
- 10–15 years of overall software development experience.
- Minimum 4–6 years in a Software Architecture role.
- Expert-level knowledge and hands-on experience in Java, JEE, Spring Boot, JPA/Hibernate, and RESTful web services.
- 8+ years of experience leading software development teams.
- Proven experience in designing scalable, secure enterprise applications.
- Strong understanding of the software development lifecycle (SDLC), CI/CD, and Agile methodologies.
- Experience in framework development and reusable component design.
- Deep domain expertise in Healthcare (minimum 7+ years).
- Excellent problem-solving, communication, and leadership skills.

Nice to Have
- Experience with cloud platforms (AWS, Azure, or GCP).
- Knowledge of microservices architecture and containerization (Docker/Kubernetes).
- Exposure to HL7, FHIR, or other healthcare interoperability standards.
- Understanding of data privacy and regulatory standards (e.g., HIPAA).

What We Offer
- Competitive compensation and benefits
- Flexible work environment with remote collaboration
- Career growth through learning and innovation
- Opportunity to work on impactful Healthcare solutions

Skills: Java, Google Cloud Platform, Microservices

Posted 2 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities
- Set up and maintain monitoring dashboards for ETL jobs using Datadog, including metrics, logs, and alerts.
- Monitor daily ETL workflows and proactively detect and resolve data pipeline failures or performance issues.
- Create Datadog Monitors for job status (success/failure), job duration, resource utilization, and error trends.
- Work closely with Data Engineering teams to onboard new pipelines and ensure observability best practices.
- Integrate Datadog with related tools.
- Conduct root cause analysis of ETL failures and performance bottlenecks.
- Tune thresholds, baselines, and anomaly detection settings in Datadog to reduce false positives.
- Document incident handling procedures and contribute to improving overall ETL monitoring maturity.
- Participate in on-call rotations or scheduled support windows to manage ETL health.

Required Skills & Qualifications
- 3+ years of experience in ETL/data pipeline monitoring, preferably in a cloud or hybrid environment.
- Proficiency in using Datadog for metrics, logging, alerting, and dashboards.
- Strong understanding of ETL concepts and tools (e.g., Airflow, Informatica, Talend, AWS Glue, or dbt).
- Familiarity with SQL and querying large datasets.
- Experience working with Python, Shell scripting, or Bash for automation and log parsing.
- Understanding of cloud platforms (AWS/GCP/Azure) and services like S3, Redshift, BigQuery, etc.
- Knowledge of CI/CD and DevOps principles related to data infrastructure monitoring.

Preferred Qualifications
- Experience with distributed tracing and APM in Datadog.
- Prior experience monitoring Spark, Kafka, or streaming pipelines.
- Familiarity with ticketing tools (e.g., Jira, ServiceNow) and incident management workflows.
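To make the "Create Datadog Monitors for job status" responsibility concrete, here is a hedged sketch using the legacy datadog Python client to create a metric-alert monitor on a hypothetical ETL failure metric. The metric name (etl.job.failures), tags, thresholds, and notification handle are illustrative assumptions, not details from the posting.

```python
# Sketch: create a Datadog metric-alert monitor for ETL job failures.
# Assumes the legacy `datadog` Python package and hypothetical metric/tag names.
import os

from datadog import api, initialize

initialize(api_key=os.environ["DD_API_KEY"], app_key=os.environ["DD_APP_KEY"])

monitor = api.Monitor.create(
    type="metric alert",
    # Alert if any failures were reported for a pipeline in the last 15 minutes.
    query="sum(last_15m):sum:etl.job.failures{env:prod} by {pipeline} > 0",
    name="ETL job failure detected ({{pipeline.name}})",
    message=(
        "ETL pipeline {{pipeline.name}} reported failures in the last 15 minutes. "
        "Check the job logs and rerun if needed. @pagerduty-data-oncall"
    ),
    tags=["team:data-eng", "service:etl"],
    options={"notify_no_data": False, "renotify_interval": 60},
)
print("Created monitor:", monitor.get("id"))
```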

Posted 2 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Description
We are looking for a talented and experienced Sr Software Engineer to join our dynamic team. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services.

Skills
We are seeking engineers with diverse specialties and skills to join our dynamic team to innovate and solve complex challenges. Our team is looking for strong talent with expertise in the following areas:
- Back End Engineer (API Development, Database Management, Security Practices, Message Queuing)
- DevOps Engineer (k8s, CI/CD Pipelines, Managed File Transfer, Containerization/Orchestration, Cloud Platforms)
- Good to have: AI knowledge (Machine Learning Frameworks, Data Processing, Big Data Technologies, Generative AI)

Responsibilities
- Software Development: Write clean, maintainable, and efficient code for various software applications and systems.
- DevOps Model: Understand working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy, and maintain the software in production.
- Design and Architecture: Participate in design reviews with peers and stakeholders.
- Code Review: Review code developed by other developers, providing feedback that adheres to industry-standard best practices such as coding guidelines.
- Testing: Build testable software, define tests, participate in the testing process, and automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide.
- Debugging and Troubleshooting: Triage defects or customer-reported issues, debugging and resolving them in a timely and efficient manner.
- Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences.
- Documentation: Properly document new features, enhancements, or fixes to the product, and contribute to training materials.

Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 3+ years of professional software development experience.
- Proficiency in one or more programming languages such as Java, C#, .NET, Python, JavaScript.
- Good understanding of cloud technologies and DevOps principles.
- Experience with software development practices and design patterns.
- Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA.
- Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services.

Preferred Qualifications
- Experience with cloud platforms like GCP, Azure, or AWS.
- Experience with test automation frameworks and tools.
- Knowledge of agile development methodologies.
- Commitment to continuous learning and professional development.
- Good communication and interpersonal skills, with the ability to work effectively in a collaborative team environment.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com
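As a small illustration of the test automation pyramid referenced in the Testing responsibility, here is a hedged pytest sketch of the unit-test layer. The function and test names are hypothetical and not taken from any UKG codebase.

```python
# Hypothetical unit-test layer of the test automation pyramid, using pytest.
import pytest


def overtime_hours(hours_worked: float, standard_week: float = 40.0) -> float:
    """Return hours beyond the standard work week (simple illustrative rule)."""
    if hours_worked < 0:
        raise ValueError("hours_worked cannot be negative")
    return max(0.0, hours_worked - standard_week)


def test_no_overtime_below_standard_week():
    assert overtime_hours(38.5) == 0.0


def test_overtime_above_standard_week():
    assert overtime_hours(45.0) == 5.0


def test_negative_hours_rejected():
    with pytest.raises(ValueError):
        overtime_hours(-1.0)
```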

Posted 2 days ago

Apply

8.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
Support the day-to-day operations of these GCP-based data pipelines, ensuring data governance, reliability, and performance optimization. Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

A key aspect of the MDLZ Google Cloud BigQuery platform is handling the complexity of inbound data, which often does not follow a global design (e.g., variations in channel inventory, customer PoS, hierarchies, distribution, and promo plans). You will assist in ensuring the robust operation of pipelines that translate this varied inbound data into the standardized o9 global design. This also includes managing pipelines for different data drivers, ensuring consistent input to o9.

Requirements
- 8+ years of overall industry experience and a minimum of 8-10 years of experience building and deploying large-scale data processing pipelines in a production environment.
- Focus on excellence: has practical experience of data-driven approaches, is familiar with the application of data security strategy, is familiar with well-known data engineering tools and platforms.
- Technical depth and breadth: able to build and operate data pipelines and data storage; has worked on big data architecture within distributed systems; is familiar with infrastructure definition and automation in this context; is aware of adjacent technologies and can speak to the alternative tech choices to those made on their projects.
- Implementation and automation of internal data extraction from SAP BW / HANA.
- Implementation and automation of external data extraction from openly available internet data sources via APIs.
- Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, SparkR.
- Preparing consolidated DataMarts for use by Data Scientists and managing SQL databases.
- Exposing data via Alteryx and SQL Database for consumption in Tableau.
- Data documentation maintenance/updates.
- Collaboration and workflow using a version control system (e.g., GitHub).
- Learning ability: is self-reflective, has a hunger to improve, has a keen interest in driving their own learning, and applies theoretical knowledge to practice.
- Flexible working hours: this role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Data engineering concepts: experience working with data lakes, data warehouses, and data marts; has implemented ETL/ELT and SCD concepts.
- ETL or data integration tool: experience in Talend is highly desirable.
- Analytics: fluent with SQL and PL/SQL and has used analytics tools like BigQuery for data analytics.
- Cloud experience: experienced in GCP services like Cloud Functions, Cloud Run, Dataflow, Dataproc, and BigQuery.
- Data sources: experience working with structured data sources like SAP, BW, flat files, RDBMS, etc., and semi-structured data sources like PDF, JSON, XML, etc.
- Programming: understanding of OOP concepts and hands-on experience with Python/Java for programming and scripting.
- Data processing: experience working with data processing platforms like Dataflow or Databricks.
- Orchestration: experience orchestrating/scheduling data pipelines using tools like Airflow and Alteryx.
- Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Skills And Experience
- Rich experience working with the FMCG industry.
- Deep knowledge in manipulating, processing, and extracting value from datasets.
- 5+ years of experience in data engineering, business intelligence, data science, or a related field.
- Proficiency with programming languages: SQL, Python, R, Spark, PySpark, SparkR for data processing.
- Strong project management skills and ability to plan and prioritize work in a fast-paced environment.
- Experience with: MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, Tableau.
- Ability to think creatively; highly driven and self-motivated.
- Knowledge of SAP BW for HANA (Extractors, Transformations, Modeling aDSOs, Queries, OpenHubs).

No relocation support available.

Business Unit Summary
Headquartered in Singapore, Mondelēz International’s Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees and operates in more than 27 countries including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers and in offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
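As a hedged illustration of the cloud experience items above, the sketch below uses the google-cloud-bigquery client to load an inbound file from Cloud Storage into a staging table and standardize it with a query. The project, dataset, table, bucket, and column names are hypothetical and not taken from the MDLZ platform.

```python
# Sketch: load inbound CSV data into BigQuery and standardize it (hypothetical names).
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project id

# 1) Load a raw inbound file from GCS into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-inbound-bucket/pos/2025-06-15.csv",
    "my-gcp-project.staging.customer_pos_raw",
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    ),
)
load_job.result()  # wait for completion

# 2) Translate the varied inbound layout into a standardized table.
standardize_sql = """
    INSERT INTO `my-gcp-project.curated.customer_pos`
    SELECT CAST(store_id AS STRING) AS store_id,
           DATE(txn_date)           AS txn_date,
           SUM(sales_value)         AS sales_value
    FROM `my-gcp-project.staging.customer_pos_raw`
    GROUP BY store_id, txn_date
"""
client.query(standardize_sql).result()
print("Inbound POS data standardized.")
```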

Posted 2 days ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on.

At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose — a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose — people — then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.

Responsibilities

Technical Leadership
- Provide technical leadership and direction for major projects, ensuring alignment with business goals and industry best practices.
- Be hands-on with code, maintaining high technical standards and actively participating in design and architecture decisions, code reviews, and helping engineers optimize their code.
- Ensure that high standards of performance, scalability, and reliability are maintained when architecting, designing, and developing complex software systems and applications.
- Ensure accountability for the team’s technical decisions and enforce engineering best practices (e.g., documentation, automation, code management, security principles, leveraging Copilot).
- Ensure the health and quality of services and incidents, proactively identifying and addressing issues. Utilize service health indicators and telemetry for action. Implement best practices for operational excellence.
- Play a pivotal role in the R.I.D.E. (Review, Inspect, Decide, Execute) framework.
- Understand CI/CD pipelines from build and test through to deploy phases.

Team Management
- Lead and manage a team of software engineers, fostering a collaborative and high-performance environment.
- Conduct regular performance reviews, provide feedback, and support professional development.
- Foster a culture of service ownership and enhance team engagement.
- Drive succession planning and engineering efficiency, focusing on quality and developer experience through data-driven approaches.
- Promote a growth mindset, understanding and driving organizational change.
- Actively seek opportunities for team growth and cross-functional collaboration.
- Work with and guide the team on how to operate in a DevOps model, taking ownership from working with product management on requirements through to designing, developing, testing, deploying, and maintaining the software in production.

Coaching And Development
- Grow and develop the team technically and with a quality mindset, providing strong and actionable feedback.
- Provide technical mentorship and guidance to engineers at all levels, fostering a culture of learning, collaboration, and continuous improvement, and encouraging the team to experiment, learn, and iterate on processes and technologies.
- Stay current with emerging technologies and industry trends, advocating for their adoption where appropriate to drive innovation and productivity within the team.

Execution Excellence
- Oversee the planning, execution, and delivery of high-impact software projects, ensuring they are completed on time and within budget.
- Manage team workload and capacity, setting priorities and managing risks and tradeoffs.
- Align team efforts with the strategic direction of the company, understanding the big picture and business needs.
- Demonstrate engineering excellence and service ownership, including cost and quality management of services, and effective production management.
- Collaborate with cross-functional teams, including product management, design, and operations, to ensure alignment and successful delivery of projects.
- Communicate effectively with stakeholders at all levels.
- Make deployment decisions with appropriate risk mitigation.

Minimum Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
- 8+ years of experience in software development, with 3+ years in a technical leadership role and 2+ years in a people management role.
- Proven track record of leading and delivering large-scale, complex software projects.
- Deep expertise in one or more programming languages such as C, C++, C#, .NET, Python, Java, or JavaScript.
- Extensive experience with software architecture and design patterns.
- Strong understanding of cloud technologies and DevOps principles.
- Excellent problem-solving skills and attention to detail.
- Excellent communication and leadership skills, with a demonstrated ability to influence and drive change.

Preferred Qualifications
- Master’s degree or PhD in Computer Science, Engineering, or a related technical field.
- Experience with cloud platforms like Azure, AWS, or GCP.
- Familiarity with CI/CD pipelines and automation tools.
- Knowledge of agile development methodologies.
- Experience in a complex, matrixed organization.
- Demonstrated commitment to diversity and inclusion initiatives.
- Familiarity with developing accessible technologies.

Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it’s our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.

Disability Accommodation
For individuals with disabilities that need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 2 days ago

Apply

6.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
Support the day-to-day operations of these GCP-based data pipelines, ensuring data governance, reliability, and performance optimization. Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week.

A key aspect of the MDLZ DataHub Google BigQuery platform is handling the complexity of inbound data, which often does not follow a global design (e.g., variations in channel inventory, customer PoS, hierarchies, distribution, and promo plans). You will assist in ensuring the robust operation of pipelines that translate this varied inbound data into the standardized o9 global design. This also includes managing pipelines for different data drivers (> 6 months vs. 0-6 months), ensuring consistent input to o9.

Requirements
- 6+ years of overall industry experience and a minimum of 6-8 years of experience building and deploying large-scale data processing pipelines in a production environment.
- Focus on excellence: has practical experience of data-driven approaches, is familiar with the application of data security strategy, is familiar with well-known data engineering tools and platforms.
- Technical depth and breadth: able to build and operate data pipelines and data storage; has worked on big data architecture within distributed systems; is familiar with infrastructure definition and automation in this context; is aware of adjacent technologies and can speak to the alternative tech choices to those made on their projects.
- Implementation and automation of internal data extraction from SAP BW / HANA.
- Implementation and automation of external data extraction from openly available internet data sources via APIs.
- Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, SparkR.
- Preparing consolidated DataMarts for use by Data Scientists and managing SQL databases.
- Exposing data via Alteryx and SQL Database for consumption in Tableau.
- Data documentation maintenance/updates.
- Collaboration and workflow using a version control system (e.g., GitHub).
- Learning ability: is self-reflective, has a hunger to improve, has a keen interest in driving their own learning, and applies theoretical knowledge to practice.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Data engineering concepts: experience working with data lakes, data warehouses, and data marts; has implemented ETL/ELT and SCD concepts.
- ETL or data integration tool: experience in Talend is highly desirable.
- Analytics: fluent with SQL and PL/SQL and has used analytics tools like BigQuery for data analytics.
- Cloud experience: experienced in GCP services like Cloud Functions, Cloud Run, Dataflow, Dataproc, and BigQuery.
- Data sources: experience working with structured data sources like SAP, BW, flat files, RDBMS, etc., and semi-structured data sources like PDF, JSON, XML, etc.
- Flexible working hours: this role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.
- Data processing: experience working with data processing platforms like Dataflow or Databricks.
- Orchestration: experience orchestrating/scheduling data pipelines using tools like Airflow and Alteryx.
- Keep our data separated and secure across national boundaries through multiple data centers and Azure regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Skills And Experience
- Deep knowledge in manipulating, processing, and extracting value from datasets.
- At least 2 years of FMCG/CPG industry experience.
- 5+ years of experience in data engineering, business intelligence, data science, or a related field.
- Proficiency with programming languages: SQL, Python, R, Spark, PySpark, SparkR for data processing.
- Strong project management skills and ability to plan and prioritize work in a fast-paced environment.
- Experience with: MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, Tableau.
- Ability to think creatively; highly driven and self-motivated.
- Knowledge of SAP BW for HANA (Extractors, Transformations, Modeling aDSOs, Queries, OpenHubs).

No relocation support available.

Business Unit Summary
Headquartered in Singapore, Mondelēz International’s Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees and operates in more than 27 countries including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers and in offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
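Since this posting highlights Pub/Sub and real-time streaming architectures, here is a hedged sketch of a streaming subscriber that acknowledges inbound messages. The project ID, subscription name, and message handling are hypothetical placeholders, not details from the posting.

```python
# Sketch: consume inbound streaming records from a Pub/Sub subscription (hypothetical names).
import json
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"        # assumed
SUBSCRIPTION_ID = "inbound-pos-sub"  # assumed

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    record = json.loads(message.data.decode("utf-8"))
    # Placeholder processing: in practice, validate and route to BigQuery/Dataflow.
    print("received record for store:", record.get("store_id"))
    message.ack()


streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print(f"Listening on {subscription_path}...")

try:
    streaming_pull_future.result(timeout=60)  # run for 60 seconds in this sketch
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()
```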

Posted 2 days ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Job Description
Looking for a savvy Data Engineer to join a team of Modeling / Architect experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives. This role requires a flexible working schedule, including potential weekend support for critical operations, while maintaining a 40-hour work week.

In this role, you will assist in maintaining the MDLZ DataHub Google BigQuery data pipelines and corresponding platforms (on-prem and cloud), working closely with global teams on DataOps initiatives. The D4GV platform spans three key GCP instances: NALA, MEU, and AMEA, supporting the global rollout of o9 across all Mondelēz BUs over the next three years.

Requirements
- 5+ years of overall industry experience and a minimum of 2-4 years of experience building and deploying large-scale data processing pipelines in a production environment.
- Focus on excellence: has practical experience of data-driven approaches, is familiar with the application of data security strategy, is familiar with well-known data engineering tools and platforms.
- Technical depth and breadth: able to build and operate data pipelines and data storage; has worked on big data architecture within distributed systems; is familiar with infrastructure definition and automation in this context; is aware of adjacent technologies and can speak to the alternative tech choices to those made on their projects.
- Implementation and automation of internal data extraction from SAP BW / HANA.
- Implementation and automation of external data extraction from openly available internet data sources via APIs.
- Data cleaning, curation, and enrichment using Alteryx, SQL, Python, R, PySpark, SparkR.
- Data ingestion and management in Hadoop / Hive.
- Preparing consolidated DataMarts for use by Data Scientists and managing SQL databases.
- Exposing data via Alteryx and SQL Database for consumption in Tableau.
- Data documentation maintenance/updates.
- Collaboration and workflow using a version control system (e.g., GitHub).
- Learning ability: is self-reflective, has a hunger to improve, has a keen interest in driving their own learning, and applies theoretical knowledge to practice.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Flexible working hours: this role requires the flexibility to work non-traditional hours, including providing support during off-hours or weekends for critical data pipeline job runs, deployments, or incident response, while ensuring the total work commitment remains a 40-hour week.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Skills And Experience
- Deep knowledge in manipulating, processing, and extracting value from datasets; support the day-to-day operations of these GCP-based data pipelines, ensuring data governance, reliability, and performance optimization.
- Hands-on experience with GCP data services such as Dataflow, BigQuery, Dataproc, Pub/Sub, and real-time streaming architectures is preferred.
- 5+ years of experience in data engineering, business intelligence, data science, or a related field.
- Proficiency with programming languages: SQL, Python, R, Spark, PySpark, SparkR for data processing.
- Strong project management skills and ability to plan and prioritize work in a fast-paced environment.
- Experience with: MS Azure Data Factory, MS Azure Data Lake Store, SQL Database, SAP BW / ECC / HANA, Alteryx, Tableau.
- Ability to think creatively; highly driven and self-motivated.
- Knowledge of SAP BW for HANA (Extractors, Transformations, Modeling aDSOs, Queries, OpenHubs).

No relocation support available.

Business Unit Summary
Headquartered in Singapore, Mondelēz International’s Asia, Middle East and Africa (AMEA) region is comprised of six business units, has more than 21,000 employees and operates in more than 27 countries including Australia, China, Indonesia, Ghana, India, Japan, Malaysia, New Zealand, Nigeria, Philippines, Saudi Arabia, South Africa, Thailand, United Arab Emirates and Vietnam. Seventy-six nationalities work across a network of more than 35 manufacturing plants, three global research and development technical centers and in offices stretching from Auckland, New Zealand to Casablanca, Morocco. Mondelēz International in the AMEA region is the proud maker of global and local iconic brands such as Oreo and belVita biscuits, Kinh Do mooncakes, Cadbury, Cadbury Dairy Milk and Milka chocolate, Halls candy, Stride gum, Tang powdered beverage and Philadelphia cheese. We are also proud to be named a Top Employer in many of our markets.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling
Analytics & Data Science
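For the data cleaning and curation work with PySpark mentioned in the requirements, here is a hedged PySpark sketch of a small cleaning step. The file paths and column names are hypothetical and purely illustrative.

```python
# Sketch: clean and curate a raw extract with PySpark (hypothetical paths/columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pos_cleaning_sketch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("/data/raw/customer_pos/*.csv")  # assumed landing path
)

curated = (
    raw.dropDuplicates(["store_id", "txn_date", "sku"])
    .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
    .withColumn("sales_value", F.col("sales_value").cast("double"))
    .filter(F.col("sales_value").isNotNull())
)

# Write the curated result partitioned by date for downstream DataMart consumption.
curated.write.mode("overwrite").partitionBy("txn_date").parquet("/data/curated/customer_pos")
spark.stop()
```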

Posted 2 days ago

Apply

6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Job Description: Machine Learning Analytics Manager (BFSI)
We are seeking a highly skilled and experienced Machine Learning Analytics Manager with a minimum of 6 years of relevant experience, preferably in the BFSI (Banking, Financial Services, and Insurance) industry. As a Machine Learning Analytics Manager, you will play a key role in leading our data analytics team and driving innovative solutions using machine learning techniques.

Responsibilities:
- Team Leadership: Lead and mentor a team of data scientists, analysts, and machine learning engineers to deliver impactful data-driven insights and solutions. Provide guidance and support in problem-solving, algorithm development, and model evaluation.
- Machine Learning Strategy: Collaborate with cross-functional teams to identify and prioritize business challenges that can be addressed through machine learning and data analytics. Develop a comprehensive machine learning strategy aligned with the company's objectives.
- Model Development: Oversee the development of machine learning models and algorithms, including supervised and unsupervised learning methods, regression models, neural networks, and decision trees.
- Data Analysis: Conduct exploratory data analysis to understand underlying patterns, trends, and opportunities in large and complex datasets related to BFSI operations.
- Model Deployment: Drive the integration of machine learning models into production systems, ensuring scalability, efficiency, and accuracy in real-world applications.
- Risk Analysis: Utilize machine learning techniques to assess and mitigate risks in the BFSI domain, including credit risk, fraud detection, and market risk.
- Performance Monitoring: Implement monitoring mechanisms to track model performance and drive continuous improvement to ensure the models' effectiveness over time.
- Collaborative Projects: Collaborate with cross-functional teams, including product managers, business analysts, and IT professionals, to design and implement data-driven solutions for various business use cases.
- Industry Trends: Stay updated with the latest developments and best practices in machine learning, data analytics, and the BFSI industry, and incorporate relevant advancements into the team's projects.

Requirements:
- Master's degree in Computer Science, Statistics, Mathematics, or a related field.
- 6+ years of hands-on experience in machine learning and data analytics, with a significant focus on BFSI applications.
- Proficiency in programming languages such as PySpark, Python, or R; experience working with data visualization tools and with Azure, AWS, or GCP (any one cloud solution).
- Extensive knowledge of machine learning algorithms and techniques, including supervised and unsupervised learning, time series analysis, and natural language processing.
- Strong leadership and team management skills, with the ability to drive projects, collaborate effectively with cross-functional teams, and manage stakeholders.
- Demonstrated expertise in deploying machine learning models in real-world applications, preferably in the BFSI industry.
- Excellent problem-solving and analytical skills, with a data-driven approach to decision-making.
- Strong communication skills, both written and verbal, to present complex technical concepts to non-technical stakeholders.

Mandatory Skills: Machine Learning
Preferred Skills: Machine Learning
Years of Experience: 8-10
Qualifications: BTech
Required Skills: Machine Learning
Optional Skills:
Desired Languages (If blank, desired languages not specified):
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
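To ground the credit risk and model development responsibilities in something concrete, here is a hedged scikit-learn sketch of a simple credit-default classifier trained on synthetic data. The feature names, label rule, and data are illustrative only and are not drawn from any BFSI dataset.

```python
# Sketch: a simple credit-default classifier with scikit-learn (synthetic data).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000

# Synthetic features loosely resembling credit attributes (illustrative only).
income = rng.normal(60_000, 20_000, n)
utilization = rng.uniform(0, 1, n)
late_payments = rng.poisson(0.5, n)
X = np.column_stack([income, utilization, late_payments])

# Synthetic default label: higher utilization and more late payments raise default risk.
logit = -2.0 + 3.0 * utilization + 0.8 * late_payments - 0.00001 * income
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print("Test ROC AUC:", round(roc_auc_score(y_test, scores), 3))
```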

Posted 2 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position Overview
Job Title: Technology Service Analyst, AS
Location: Pune, India
Corporate Title: AS

Role Description
At the heart of Deutsche Bank's client franchise is the Corporate Bank (CB), a market leader in Cash Management, Trade Finance & Lending, Securities Services and Trust & Agency services. Focusing on the Treasurers and Finance Departments of Corporate and Commercial clients and Financial Institutions across the globe, our universal expertise and global network allow us to offer truly integrated and effective solutions.

You will be operating within Corporate Bank Production as a Production Support Engineer in the Payments domain. The Payments Production domain is part of Cash Management under Deutsche Bank's Corporate Banking division, which supports mission-critical payments processing and FX platforms for multiple business lines such as High Value, Low Value, Bulk, Instant and Cheque payments. The team provides 24x7 support and follows a ‘follow the sun’ model to provide exceptional and timebound services to clients.

Our objective at Corporate Bank Production is to consistently strive to make production better, which ensures a promising end-to-end experience for our corporate clients running their daily Cash Management business through various access channels. We also implement, encourage, and invest in building an engineering culture in our daily activities to achieve the wider objectives. Our strategy leads to a reduced number of issues, faster resolution of issues, and safeguarding of any changes made to our production environment, across all domains at Corporate Bank.

You will be accountable for driving a culture of proactive continual improvement in the production environment through application and user request support, troubleshooting and resolving errors in production, automation of manual work, monitoring improvements, and platform hygiene, as well as supporting the resolution of issues and conflicts and preparing reports and meetings. The candidate should have experience in all relevant tools used in the Service Operations environment, have specialist expertise in one or more technical domains, and ensure that all associated Service Operations stakeholders are provided with an optimum level of service in line with Service Level Agreements (SLAs) / Operating Level Agreements (OLAs).

What We’ll Offer You
As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:
- Best in class leave policy
- Gender neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive Hospitalization Insurance for you and your dependents
- Accident and Term Life Insurance
- Complementary health screening for 35 yrs. and above

Your Key Responsibilities
- Act as a Production Support Analyst for the CB production team, providing second-level support for the applications under the tribe, working with key stakeholders and team members across the globe in a 365-day, 24/7 working model.
- Act as an individual contributor and prime liaison for the application suite into incident, problem, change, release, capacity, and continuous improvement processes.
- Escalation, management, and communication of major production incidents.
- Liaise with development teams on new application handover and 3rd-line escalation of issues.
- Application rollout activities (may include some weekend activities).
- Manage SLOs for faster resolution and fewer incidents for production application stability.
- Develop a Continuous Service Improvement approach to resolve IT failings, drive efficiencies and remove repetition to streamline support activities, reduce risk, and improve system availability by understanding emerging trends and proactively addressing them.
- Carry out technical analysis of the production platform to identify and remediate performance and resiliency issues.
- Update the run book and KEDB as and when required.

Your Skills And Experience
- Good experience in Production Application Support and ITIL practices.
- Very good hands-on knowledge of databases (Oracle, PL/SQL, etc.), including working experience of writing SQL scripts and queries.
- Very good hands-on experience with UNIX/Linux, Solaris, Java J2EE, Python, PowerShell scripts, and tools for automation (RPA, workload, batch).
- Exposure to Kafka, Kubernetes and microservices is an added advantage.
- Experience in application performance monitoring tools – Geneos, Splunk, Grafana & New Relic – and scheduling tools (Control-M).
- Excellent team player; people management experience is an advantage.
- Bachelor's degree; Master's degree a plus.
- Previous relevant experience in the Banking domain.
- 6+ years' experience in IT in large corporate environments, specifically in production support.
- Understanding of operating systems (e.g. UNIX, Windows) and environments.
- Middleware (e.g. MQ, WebLogic, Tomcat, JBoss, Apache, Kafka, etc.).
- Database environments (e.g. Oracle, MS-SQL, Sybase, NoSQL).
- Experience in APM tools like Splunk & Geneos; Control-M / Autosys; AppDynamics.
- Nice to have: cloud services (GCP), exposure to Payments domain fundamentals & SWIFT message types, and knowledge of uDeploy and Bitbucket.

Skills That Will Help You Excel
- Self-motivated with excellent interpersonal, presentation, and communication skills.
- Able to think strategically with strong analytical and problem-solving skills.
- Able to handle multiple demands and priorities simultaneously, work under pressure in an organized manner, and work with teams across multiple locations and time zones.
- Able to connect, manage and influence people from different backgrounds and cultures.
- A strong team player, being part of a global team, communicating, managing, and cooperating closely on a global level, while being able to take ownership and deliver independently.

How We’ll Support You
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs

About Us And Our Teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
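Since the posting calls for Python scripting for automation and log analysis in production support, here is a hedged stdlib-only sketch that scans application logs for error patterns and summarizes them per service. The log directory layout and error-code format are hypothetical assumptions, not Deutsche Bank specifics.

```python
# Sketch: summarize ERROR lines per service from application logs (hypothetical layout).
import re
from collections import Counter
from pathlib import Path

LOG_DIR = Path("/var/log/payments")                              # assumed log location
ERROR_RE = re.compile(r"\bERROR\b\s+(?P<code>[A-Z]{3}\d{3})?")   # assumed log format


def scan_logs(log_dir: Path) -> Counter:
    """Count ERROR occurrences (optionally by error code) across *.log files."""
    counts: Counter = Counter()
    for log_file in log_dir.glob("*.log"):
        for line in log_file.read_text(errors="ignore").splitlines():
            match = ERROR_RE.search(line)
            if match:
                counts[(log_file.stem, match.group("code") or "UNCODED")] += 1
    return counts


if __name__ == "__main__":
    for (service, code), count in sorted(scan_logs(LOG_DIR).items(), key=lambda kv: -kv[1]):
        print(f"{service:<25} {code:<10} {count}")
```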

Posted 2 days ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


CACI India, RMZ Nexity, Tower 30, 4th Floor, Survey No. 83/1, Knowledge City, Raidurg Village, Silpa Gram Craft Village, Madhapur, Serilingampalle (M), Hyderabad, Telangana 500081, India
Req #1161
15 June 2025

CACI International Inc is an American multinational professional services and information technology company headquartered in Northern Virginia. CACI provides expertise and technology to enterprise and mission customers in support of national security missions and government transformation for defense, intelligence, and civilian customers. CACI has approximately 23,000 employees worldwide. Headquartered in London, CACI Ltd is a wholly owned subsidiary of CACI International Inc., a publicly listed company on the NYSE with annual revenue in excess of US $6.2bn. Founded in 2022, CACI India is an exciting, growing and progressive business unit of CACI Ltd. CACI Ltd currently has over 2000 intelligent professionals and is now adding many more from our Hyderabad and Pune offices. Through a rigorous emphasis on quality, CACI India has grown considerably to become one of the UK's most well-respected technology centres.

Role Overview
We're looking for a hands-on Technical Project Manager who can own the delivery of complex, cloud-native products in a fast-growing SaaS environment. You'll partner with Engineering, Product, UX, DevOps and Business stakeholders to plan, execute and launch features that delight customers and scale globally.

Key Responsibilities
- End-to-End Project Ownership – Define scope, timelines, deliverables and success metrics for multiple concurrent product development streams.
- Agile Leadership – Champion Scrum/Kanban practices; facilitate sprint planning, stand-ups, retrospectives and demos.
- Cross-Functional Coordination – Align Engineering, QA, UX, Product, DevOps & Security teams, ensuring shared understanding of goals and dependencies.
- Stakeholder Communication – Provide clear, data-driven status updates to leadership and customers; manage expectations and negotiate trade-offs.
- Risk & Issue Management – Identify technical and delivery risks early, create mitigation plans and drive resolution.
- Quality & Release Management – Enforce definition of done, oversee test coverage, CI/CD pipelines and production release readiness.
- Budget & Resource Management – Forecast and track project budgets, resource allocation and vendor engagement.
- Process Improvement – Analyse sprint metrics (velocity, burndown, DORA, OKRs) and implement continuous improvement initiatives.

Must-Have Qualifications
- 10+ years total experience in software development & delivery, with 3+ years as a Technical Project/Program Manager.
- Proven track record launching B2B/B2C SaaS products or cloud-based platforms end-to-end.
- Solid foundation in software engineering (B.E./B.Tech. in CS/IT or equivalent).
- Expert knowledge of Agile/Scrum frameworks and tools (Jira, Azure DevOps, etc.).
- Working familiarity with microservices, REST APIs, CI/CD pipelines, and public cloud (AWS, Azure or GCP).
- Strong analytical mindset; comfortable using data to drive decisions and report progress.
- Exceptional written & verbal communication; able to influence technical and non-technical audiences.

Preferred Skills & Certifications
- PMP, PRINCE2, PMI-ACP, CSM or equivalent agile/project management certification.
- Experience scaling multi-tenant SaaS platforms, subscription billing, and usage-based pricing models.
- Exposure to DevOps/SRE practices, Infrastructure as Code, and security compliance (SOC 2, ISO 27001, GDPR/DPDP).
- Prior success in a high-growth startup or global scale-up environment.

More About The Opportunity
The Technical Project Manager is an excellent opportunity, and CACI Services India rewards its staff well with a competitive salary and an impressive benefits package, which includes:
- Learning: budget for conferences, training courses and other materials
- Health Benefits: family plan with 4 children and parents covered
- Future You: matched pension and health care package

We understand the importance of getting to know your colleagues. Company meetings are held every quarter, and a training/work brief weekend is held once a year, amongst many other social events.

CACI is an equal opportunities employer. Therefore, we embrace diversity and are committed to a working environment where no one will be treated less favourably on the grounds of their sex, race, disability, sexual orientation, religion, belief or age. We have a Diversity & Inclusion Steering Group and we always welcome new people with fresh perspectives from any background to join the group.

An inclusive and equitable environment enables us to draw on expertise and unique experiences and bring out the best in each other. We champion diversity, inclusion and wellbeing and we are supportive of Veterans and people from a military background. We believe that by embracing diverse experiences and backgrounds, we can collaborate to create better outcomes for our people, our customers and our society.

Other details
Pay Type: Salary

Apply Now

Posted 2 days ago

Apply

5.0 - 31.0 years

0 - 0 Lacs

Coimbatore

Remote


Job Title: Technical Support – Service Desk
Location: Coimbatore
Salary: Up to ₹9 LPA
Work Mode: Work from Office
Shift: Rotational (5 Days Working, 2 Days Off)

Key Responsibilities:
- Provide tech support via email, chat, and phone
- Resolve issues in web protocols, networking, system admin (Windows/Linux), APIs, SQL, and email delivery
- Analyze logs and use CLI for troubleshooting
- Document cases accurately
- Handle escalations with internal teams
- Improve support processes and knowledge base
- Mentor junior staff
- Participate in on-call support

Requirements:
- Familiarity with Google Workspace (GWS) and Google Cloud Platform (GCP)
- Experience with BigQuery, cloud migration tools/processes
- Exposure to scripting languages like Python, JavaScript, HTML
- Relevant certifications are a plus: CompTIA Network+, Security+, Linux+; Microsoft Certified: Azure Administrator Associate; Google Cloud Certified – Associate Cloud Engineer

Required Qualifications:
- Bachelor's Degree in Computer Science / IT / Engineering
- 5–6 years of experience in technical customer support
- Strong analytical, troubleshooting, and communication skills
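As a small illustration of the log analysis and CLI troubleshooting work described above, here is a hedged Python sketch that checks an endpoint's DNS resolution and HTTP response time. The URL is a placeholder, and the script assumes the requests package is available.

```python
# Sketch: quick endpoint triage for service-desk troubleshooting (placeholder URL).
import socket
import time
from urllib.parse import urlparse

import requests

URL = "https://status.example.com/healthz"  # placeholder endpoint

host = urlparse(URL).hostname

# 1) DNS resolution check.
try:
    ip = socket.gethostbyname(host)
    print(f"DNS ok: {host} -> {ip}")
except socket.gaierror as exc:
    raise SystemExit(f"DNS failure for {host}: {exc}")

# 2) HTTP check with timing.
start = time.monotonic()
resp = requests.get(URL, timeout=10)
elapsed_ms = (time.monotonic() - start) * 1000
print(f"HTTP {resp.status_code} in {elapsed_ms:.0f} ms")

# 3) Flag common problem categories for the ticket notes.
if resp.status_code >= 500:
    print("Likely server-side issue; escalate to the owning team.")
elif resp.status_code in (401, 403):
    print("Authentication/authorization problem; check credentials and access policy.")
```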

Posted 2 days ago

Apply

5.0 - 31.0 years

0 - 0 Lacs

Hyderabad

Remote

Apna logo

Job Title: Technical Support – Service Desk
Location: Hyderabad
Salary: Up to ₹9 LPA
Work Mode: Work from Office
Shift: Rotational (5 Days Working, 2 Days Off)

Key Responsibilities: Provide tech support via email, chat, and phone. Resolve issues in web protocols, networking, system admin (Windows/Linux), APIs, SQL, and email delivery. Analyze logs and use the CLI for troubleshooting. Document cases accurately. Handle escalations with internal teams. Improve support processes and the knowledge base. Mentor junior staff. Participate in on-call support.

Requirements: Familiarity with Google Workspace (GWS) and Google Cloud Platform (GCP). Experience with BigQuery and cloud migration tools/processes. Exposure to scripting languages like Python, JavaScript, HTML. Relevant certifications are a plus: CompTIA Network+, Security+, Linux+; Microsoft Certified: Azure Administrator Associate; Google Cloud Certified – Associate Cloud Engineer.

Required Qualifications: Bachelor's Degree in Computer Science / IT / Engineering. 5–6 years of experience in technical customer support. Strong analytical, troubleshooting, and communication skills.

Posted 2 days ago

Apply

5.0 - 31.0 years

0 - 1 Lacs

Nagole, Hyderabad

Remote

Apna logo

🔹 Job Title: Python Backend & Middleware Developer with Database Expertise 📍 Location: Hyderabad 🕒 Experience: 3–6 Years 🧾 Employment Type: Full-time 🔧 Key Responsibilities: 🔸 Python Backend Development: - Design, build, and maintain scalable RESTful APIs using Python (FastAPI/Django/Flask). - Write clean, efficient, and testable code. - Implement backend logic, data processing, and third-party API integrations. - Use asynchronous programming paradigms where required (e.g., asyncio, aiohttp). 🔸 Middleware Development: - Develop and maintain middleware components to handle cross-cutting concerns like logging, authentication, and request/response handling. - Ensure smooth communication between different systems, services, and microservices. - Optimize inter-service communication using message brokers (RabbitMQ, Kafka, etc.). - Implement caching and rate-limiting mechanisms where applicable. 🔸 Database Development: - Design and manage relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Redis) databases. - Write complex SQL queries, stored procedures, and views for efficient data retrieval. - Ensure database normalization, indexing, performance tuning, and optimization. - Implement data backup, recovery strategies, and migration scripts. 🧠 Required Skills: - Strong proficiency in Python 3.x and experience with frameworks like FastAPI, Django, or Flask. - Experience with middleware architecture, API Gateways, or microservice orchestration. - Expertise in SQL and hands-on experience with PostgreSQL / MySQL. - Familiarity with NoSQL databases like MongoDB or Redis. - Knowledge of RESTful APIs, OAuth2/JWT, and API security best practices. - Hands-on experience with Docker, Git, and CI/CD pipelines. - Familiarity with cloud platforms like AWS, GCP, or Azure is a plus. - Good understanding of software design patterns and architecture principles. ✅ Preferred Qualifications: - Bachelor's/Master's degree in Computer Science, Information Technology, or related fields. - Experience working in Agile/Scrum teams. - Exposure to Kafka, RabbitMQ, or similar messaging systems. - Experience with Unit Testing, Integration Testing, and Load Testing tools. 🧩 Soft Skills: - Strong problem-solving and analytical skills. - Excellent communication and teamwork abilities. - Ability to manage time effectively and deliver tasks within deadlines.
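The middleware responsibilities described above (cross-cutting concerns such as logging and request/response handling) can be made concrete with a small FastAPI sketch. This is a generic, hedged example rather than the employer's actual codebase; the app, logger, and endpoint names are invented:

```python
import logging
import time

from fastapi import FastAPI, Request

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("middleware-demo")

app = FastAPI()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    # Cross-cutting concern: time every request and log method, path, status.
    start = time.perf_counter()
    response = await call_next(request)
    elapsed_ms = (time.perf_counter() - start) * 1000
    logger.info("%s %s -> %s (%.1f ms)", request.method, request.url.path,
                response.status_code, elapsed_ms)
    return response

@app.get("/health")
async def health():
    return {"status": "ok"}
```

Run locally with `uvicorn your_module:app` (module name hypothetical) and every request is logged with its latency, without touching the endpoint handlers themselves.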

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune/Pimpri-Chinchwad Area

On-site

Linkedin logo

Job Description Senior Site Reliability Engineer, Pune At NielsenIQ Digital Shelf, we help the world’s leading brands measure and improve their online performance. Formerly known as Data Impact, we've recently joined NielsenIQ. Today, we operate at the intersection of scale and agility — a tech-driven environment backed by a global organization. Our Infrastructure team plays a critical, cross-functional role: we build and operate the core platforms that power our applications, ensure reliability, security, and efficiency, and empower development teams to move faster and safer. As a Senior Site Reliability Engineer, you’ll help drive our infrastructure forward — designing resilient systems, optimizing performance, and automating at scale. You’ll be part of a team that takes pride in owning foundational services used across the company. Responsibilities: Design, maintain, and evolve our infrastructure, primarily on Google Cloud Platform, with components on AWS and OVH (bare metal) Improve and standardize our CI/CD pipelines, monitoring, and observability stack Develop and maintain our Infrastructure as Code using Terraform, Ansible, and custom Bash/Python scripts Manage identity and access (IAM), SSO, and contribute to a security-first infrastructure posture Automate infrastructure provisioning and environment management for dev and production teams Define and monitor SLOs, lead post-mortems, and foster a culture of continuous improvement Mentor other team members and help spread SRE best practices across the organization Qualifications 5+ years of experience in SRE, DevOps, or Infrastructure Engineering roles in cloud-based environments Strong hands-on experience with GCP and/or AWS, Terraform, Ansible, and infrastructure automation Solid grasp of SRE principles: reliability, availability, incident response, automation Comfortable working with hybrid infrastructure (cloud + dedicated hardware) Familiar with infrastructure security, access management, and compliance best practices A collaborative mindset, technical leadership, and a drive to elevate engineering practices Additional Information Our Benefits Flexible working environment Volunteer time off LinkedIn Learning Employee-Assistance-Program (EAP) About NIQ NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. 
We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion Show more Show less
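Defining and monitoring SLOs, as mentioned in the responsibilities above, usually comes down to comparing measured availability against a target and tracking the remaining error budget. The following is a minimal, illustrative Python sketch with made-up numbers, not NIQ's actual tooling:

```python
# Hypothetical monthly figures for one service.
slo_target = 0.999           # 99.9% availability objective
total_requests = 12_500_000
failed_requests = 9_800

availability = 1 - failed_requests / total_requests
error_budget = 1 - slo_target                      # allowed failure ratio
budget_consumed = (failed_requests / total_requests) / error_budget

print(f"Measured availability: {availability:.4%}")
print(f"Error budget consumed: {budget_consumed:.0%}")
if budget_consumed > 1:
    print("SLO breached - consider freezing risky releases")
```

In a real setup the request counts would come from the observability stack (for example, a Prometheus or Grafana query), and the error-budget burn rate would feed alerting and release decisions.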

Posted 2 days ago

Apply

14.0 - 20.0 years

50 - 70 Lacs

Bengaluru

Hybrid

Naukri logo

Overview: As an SRE manager, you are responsible for the availability and reliability of Calix's cloud. At Calix, Site Reliability Engineering combines software and systems engineering to build and run large-scale, distributed, fault-tolerant systems. You would be responsible for leading a team of Site Reliability Engineers and overseeing the reliability, scalability, and maintainability of Calix's critical infrastructure. This includes building and maintaining automation tools, managing on-call rotations, collaborating with development teams, and ensuring systems meet service level objectives (SLOs), all while prioritizing continuous improvement and a strong focus on infrastructure health and stability within the Calix platform, leveraging tools like Terraform, observability frameworks from the Grafana Labs ecosystem, and Google Cloud Platform. Qualifications: - Strong experience as an SRE manager with a proven track record of managing large-scale, highly available systems. - Expertise in cloud computing platforms (preferably Google Cloud Platform). - Knowledge of core operating system principles, networking fundamentals, and systems management. - Programming skills in languages like Python and Go. - Proven experience building and leading SRE teams, including hiring, coaching, and performance management. - Deep understanding and expertise in building and maintaining scalable open-source monitoring tools and backend storage. - Experience with incident management processes and best practices. - Excellent communication and collaboration skills to work with cross-functional teams. - Knowledge of SRE principles, including error budgets, fault analysis, and reliability engineering concepts. Education: - B.S. or M.S. in Computer Science or equivalent field. Role & responsibilities

Posted 2 days ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

Remote

Linkedin logo

Join a leading U.S.-based company as a Python Developer , where you’ll play a key role in driving innovative solutions in technology. Use your Python skills to tackle meaningful challenges and work on impactful projects with global experts, focusing on efficient development and problem-solving that makes a real difference. Job Responsibilities: Develop efficient Python code to address problems effectively Apply business acumen and analytical skills to extract meaningful insights from public databases Articulate reasoning and logic coherently when writing code in Jupyter notebooks or similar platforms Collaborate closely with researchers to exchange ideas and insights Maintain thorough documentation for all developed code Job Requirements: Open to applicants of all levels, from junior to industry experts Bachelor's degree in Engineering, Computer Science, or equivalent practical experience Good grasp of Python programming language for coding and debugging purposes Knowledge of databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. A minimum 5-hour work overlap with PST/PT is required. Strong communication and teamwork skills in a remote setting. Perks: Work with top industry experts worldwide. This is a contractual remote work opportunity without traditional job constraints. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment. Show more Show less
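To make the "extract insights from databases" responsibility above concrete, here is a small, self-contained sketch in the spirit of a Jupyter notebook cell. It uses an in-memory SQLite table with made-up rows purely for illustration; it is not the hiring company's data or schema:

```python
import sqlite3

import pandas as pd

# Build a tiny in-memory table standing in for a public database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('North', 120.0), ('North', 80.0), ('South', 200.0), ('West', 50.0);
""")

# Pull an aggregate into pandas and derive a simple insight.
df = pd.read_sql_query(
    "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region", conn
)
df["share"] = df["revenue"] / df["revenue"].sum()
print(df.sort_values("revenue", ascending=False))
```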

Posted 2 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Linkedin logo

Join a leading U.S.-based company as a Python Developer , where you’ll play a key role in driving innovative solutions in technology. Use your Python skills to tackle meaningful challenges and work on impactful projects with global experts, focusing on efficient development and problem-solving that makes a real difference. Job Responsibilities: Develop efficient Python code to address problems effectively Apply business acumen and analytical skills to extract meaningful insights from public databases Articulate reasoning and logic coherently when writing code in Jupyter notebooks or similar platforms Collaborate closely with researchers to exchange ideas and insights Maintain thorough documentation for all developed code Job Requirements: Open to applicants of all levels, from junior to industry experts Bachelor's degree in Engineering, Computer Science, or equivalent practical experience Good grasp of Python programming language for coding and debugging purposes Knowledge of databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. A minimum 5-hour work overlap with PST/PT is required. Strong communication and teamwork skills in a remote setting. Perks: Work with top industry experts worldwide. This is a contractual remote work opportunity without traditional job constraints. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment. Show more Show less

Posted 2 days ago

Apply

0 years

0 Lacs

Delhi, India

Remote

Linkedin logo

Join a leading U.S.-based company as a Python Developer , where you’ll play a key role in driving innovative solutions in technology. Use your Python skills to tackle meaningful challenges and work on impactful projects with global experts, focusing on efficient development and problem-solving that makes a real difference. Job Responsibilities: Develop efficient Python code to address problems effectively Apply business acumen and analytical skills to extract meaningful insights from public databases Articulate reasoning and logic coherently when writing code in Jupyter notebooks or similar platforms Collaborate closely with researchers to exchange ideas and insights Maintain thorough documentation for all developed code Job Requirements: Open to applicants of all levels, from junior to industry experts Bachelor's degree in Engineering, Computer Science, or equivalent practical experience Good grasp of Python programming language for coding and debugging purposes Knowledge of databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. A minimum 5-hour work overlap with PST/PT is required. Strong communication and teamwork skills in a remote setting. Perks: Work with top industry experts worldwide. This is a contractual remote work opportunity without traditional job constraints. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment. Show more Show less

Posted 2 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

Join a leading U.S.-based company as a Python Developer , where you’ll play a key role in driving innovative solutions in technology. Use your Python skills to tackle meaningful challenges and work on impactful projects with global experts, focusing on efficient development and problem-solving that makes a real difference. Job Responsibilities: Develop efficient Python code to address problems effectively Apply business acumen and analytical skills to extract meaningful insights from public databases Articulate reasoning and logic coherently when writing code in Jupyter notebooks or similar platforms Collaborate closely with researchers to exchange ideas and insights Maintain thorough documentation for all developed code Job Requirements: Open to applicants of all levels, from junior to industry experts Bachelor's degree in Engineering, Computer Science, or equivalent practical experience Good grasp of Python programming language for coding and debugging purposes Knowledge of databases (SQL/NoSQL) and cloud platforms (AWS, GCP, Azure) is a plus. A minimum 5-hour work overlap with PST/PT is required. Strong communication and teamwork skills in a remote setting. Perks: Work with top industry experts worldwide. This is a contractual remote work opportunity without traditional job constraints. Competitive salary aligned with global standards. Be part of cutting-edge, high-impact projects. Selection Process: Shortlisted developers may be asked to complete an assessment. If you clear the assessment, you will be contacted for contract assignments with expected start dates, durations, and end dates. Some contract assignments require fixed weekly hours, averaging 20/30/40 hours per week for the duration of the contract assignment. Show more Show less

Posted 2 days ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; document problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Springboot, GCP SDKs, GKE/Kubernetes Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications. i.e. 
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements; Performance / Resilience: Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Show more Show less
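The automation and negative-testing skills listed above amount to expressing expected behaviour as repeatable, self-checking tests. As a generic, hedged illustration in Python with pytest (not Equifax's actual Java/Spring Boot stack), a parametrized test of a hypothetical data-masking helper might look like this:

```python
import pytest

def mask_pan(pan: str) -> str:
    """Hypothetical helper: keep the last four digits, mask the rest."""
    return "*" * (len(pan) - 4) + pan[-4:]

@pytest.mark.parametrize("pan, expected", [
    ("1234567890123456", "*" * 12 + "3456"),
    ("999912345678", "*" * 8 + "5678"),
])
def test_mask_pan_keeps_last_four(pan, expected):
    assert mask_pan(pan) == expected

def test_mask_pan_never_leaks_full_number():
    # Negative-style check: the unmasked prefix must never appear in the output.
    assert "123456789012" not in mask_pan("1234567890123456")
```

Run with `pytest` and the parametrized cases double as documented test data; the same pattern scales up to API-level and regression suites wired into a CI/CD pipeline.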

Posted 2 days ago

Apply

7.0 years

0 Lacs

India

Remote

Linkedin logo

Business Analyst Lvl 4 India (Remote) Contract JOB ID: 23670 At Wimmer Solutions, we believe care creates community. We work smart; we have built a reputation for results-oriented, innovative, business and technology solutions that help companies execute on their strategic initiatives. We have fun; we love our work. We are positive, kind, and hungry to learn. We give big; we aim to make a real impact on the causes that affect the communities we serve and build strong relationships with the dedicated volunteers and nonprofit organizations working to address them. We are all about people and community. Since 2002, we have offered technology staffing and managed services for the greater Seattle area and throughout the United States. We focus on getting to know our clients and candidates to create lasting partnerships and ensure success. We are the Partner Integration Platform team, focused on building a robust enterprise platform that enables seamless integration between external third-party applications and internal systems. Our integration spans a wide range of business functions, including Product, Selection, Pricing, Inventory, Checkout, Ordering, Fulfillment, Reverse Logistics, Supply Chain, Seller Payments, and more. We continuously develop new features and experiences using modern technology stacks such as Java, Spring Boot, AWS, GCP, and message streaming platforms. We work in an agile, team-oriented, and collaborative environment that values innovation. Developers are encouraged to take full ownership of their work throughout the software development lifecycle. As a Sr. Business Analyst, you will help develop analytic solutions to drive deep dives and provide insights into the health and state of the Artemis platform. You will transform data for the millions of unique products involving thousands of Selling Partners and millions of customers into actionable business information and make it accessible to stakeholders. From Day 1, you will be challenged with a variety of tasks, ranging from creating datasets, reports, dashboards, and monitoring. You will interact with technical teams and product owners to gather requirements and gain a deep understanding of key datasets. You will write features and stories and ensure the backlog is healthy by providing clarity for the technical team to implement. You will be responsible for building Tableau reports when needed. WHAT YOU GET TO DO: Analyze Marketplace and Wholesale experience trends and build engagement metrics for each model. Identify opportunities to improve efficiency and reduce end-to-end complexities for the Artemis platform. Develop Key Performance Indicators that help measure experimental efficacies. Engage with stakeholders across marketplace business teams, drop ship order management, and Product platform teams to socialize inbound and outbound KPIs. Design and implement reporting solutions to enable stakeholders to manage the business and make effective decisions. Monitor existing metrics, build new metrics, and partner with internal teams to identify process and system improvement opportunities. Work closely with Product Owners and the technical team to understand requirements, document them, and help write features and stories. Help the tech team maintain a healthy story backlog, identify gaps, and provide clarity by working with Product or Business partners. 
WHAT YOU BRING: 7+ years of experience in business analysis or a quantitative role. 7+ years of experience writing SQL queries and creating business intelligence reports using Tableau or Power BI. Experience defining requirements and using data and metrics to draw business insights, as well as writing features and technical stories. Experience with Tableau dashboard and visual creation. Bachelor's degree in operations, engineering, analytics, or a related field. PREFERRED QUALIFICATIONS MBA (Master’s in Business Administration) Experience making business recommendations and influencing stakeholders MORE ABOUT WIMMER SOLUTIONS Wimmer Solutions is proud to be an equal-opportunity employer. All applicants will be considered for employment regardless of race, color, religion or belief, age, gender identity, sexual orientation, national origin, parental status, veteran, or disability status. Wimmer Solutions is committed to achieving a diverse employee network through all aspects of the hiring process and we welcome all applicants. If you are passionate about what you do and want to join a diverse team dedicated to diversity, equity, and inclusion in the workplace, we would love to hear from you. Get the job you have always wanted. You will join a broad team of professionals who are energized about their careers as well as their community. For more career opportunities or to refer a friend, please visit http://wimmersolutions.com/careers and talk to a recruiter today. Show more Show less
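The KPI and reporting work described above typically starts with a small aggregation over order-level data. As a purely illustrative sketch in Python with pandas (the columns and values are invented, not the employer's actual schema), a weekly defect-rate style metric could be computed like this:

```python
import pandas as pd

# Hypothetical order-level extract; in practice this would come from the warehouse.
orders = pd.DataFrame({
    "order_week": ["2025-W23", "2025-W23", "2025-W24", "2025-W24", "2025-W24"],
    "channel":    ["marketplace", "wholesale", "marketplace", "marketplace", "wholesale"],
    "defective":  [False, True, False, True, False],
})

# KPI: defect rate per week and channel, the kind of metric a dashboard would track.
kpi = (
    orders.groupby(["order_week", "channel"])["defective"]
    .mean()
    .rename("defect_rate")
    .reset_index()
)
print(kpi)
```

The equivalent SQL GROUP BY would feed the same numbers into a Tableau or Power BI dashboard.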

Posted 2 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Linkedin logo

Ultimate.ai Software Engineering Pune, Maharashtra, India Posted on Jun 16, 2025 Apply now Job Description Note: This is a hybrid role, combining remote and on-site work, requiring 3 days in the office, and relocation to Pune. We are looking for a seasoned software engineer to join the team that owns building the Agent experience in support products: someone who identifies as a JavaScript developer with good knowledge of frontend technologies and an interest in understanding back-end architecture to put the holistic picture together. Ruby skills are a plus! Agent Workspace enables Zendesk Support agents to work seamlessly across Zendesk channels, all within a single ticket interface. It is a critical piece of how customers use Zendesk Support, the most important part of the overall product user experience, and fundamentally what makes our customers successful. As a team, we are a close-knit group that values inclusivity and diversity of backgrounds and opinions. We deliberately cultivate a highly collaborative and productive working style. The team has a proven history of developing highly reliable and scalable frontends and extending complex areas of the product. The features managed by our team power some of the most critical capabilities in the support product. The work we do has a high impact on agents’ efficiency! What You Get To Do Every Day Drive the modernization and evolution of our largest monolithic application by building rich, scalable, and performant frontend features using React, Redux, TypeScript, and GraphQL. Collaborate closely with product managers, designers, and backend engineers to deliver seamless user experiences that solve real customer problems. Write clean, maintainable, and well-tested code to implement new features and enhance existing ones within our frontend stack. Participate actively in code reviews, pair programming, and design discussions to uphold high-quality engineering standards. Identify and resolve frontend performance bottlenecks, ensuring fast, responsive, and accessible user interfaces. Evangelize best practices in frontend development, including state management, component architecture, and testing strategies. What You Bring To The Role 2 to 4+ years of professional experience focused on frontend development, ideally in SaaS or complex web application environments. Strong expertise in JavaScript and TypeScript, with deep knowledge of React and its ecosystem (Redux, React Router, hooks, etc.). Experience working with GraphQL APIs and integrating them effectively into frontend applications. Familiarity with modern frontend tooling and build systems (Webpack, Babel, ESLint, etc.). Solid understanding of responsive design, cross-browser compatibility, and accessibility standards. Passion for writing clear, maintainable code and producing thorough automated tests (unit, integration, end-to-end). Excellent communication skills, able to articulate technical concepts clearly to both technical and non-technical stakeholders. Bonus: Experience with Ember.js, cloud platforms (AWS, GCP), or containerization technologies is a plus but not required. Bonus: Contributions to open source frontend projects or active participation in frontend communities. Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based. 
Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager. The Intelligent Heart Of Customer Experience Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working, enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request. Apply now See more open positions at Ultimate.ai Show more Show less

Posted 2 days ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; document problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Springboot, GCP SDKs, GKE/Kubernetes Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications. i.e. 
Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements; Performance / Resilience: Understanding application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform. Conducting the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. 
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Show more Show less

Posted 2 days ago

Apply

3.0 - 7.0 years

8 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Associate Consultant / Consultant / Assistant Manager - Cyber Security Auditor Location: Bangalore Skills Required: We are seeking a highly skilled Cyber Security Auditor with expertise in auditing cyber security processes, risks, and controls, a strong understanding of industry frameworks such as NIST (e.g., NIST CSF, NIST 800-53), and hands-on experience in assessing cybersecurity risks, governance controls, and technical security measures. This role involves validating control effectiveness and performing closure verification/issue validation to strengthen the cyber security posture. Responsibilities: Conduct assessments of cyber security risk and controls across network security, application security, vulnerability management, and governance controls. Perform closure verification and issue validation for security findings, ensuring remediation aligns with risk reduction objectives. Evaluate vulnerability management programs, patch management processes, and threat intelligence integration. Review and test governance controls related to cyber security policies. Strong understanding of NIST frameworks (CSF, 800-53), ISO 27001, CIS Controls, and regulatory requirements. Technical expertise in network security, firewalls, intrusion detection/prevention systems (IDS/IPS), SIEM tools, and endpoint security. Hands-on experience in application security, vulnerability management, patch management, and security monitoring. Strong knowledge of network protocols (TCP/IP, HTTP, SSL/TLS, DNS, VPN, etc.) and secure configurations. Familiarity with cloud security controls (AWS, Azure, GCP) and DevSecOps principles. Professional certifications such as CISA, CISSP, CISM, CRISC, CEH, or GIAC certifications (GCIH, GCFA, GPEN) are highly desirable. Stay up to date with emerging cyber threats, attack techniques, and regulatory requirements impacting security controls. Qualification: A Bachelor's degree in engineering and approximately 3–6 years of related work experience; or a master's or MBA degree in business, computer science, information systems, or engineering. Technical knowledge of IT audit tools. A strong understanding of industry frameworks such as NIST (e.g., NIST CSF, NIST 800-53). Hands-on experience in assessing cybersecurity risks, governance controls, and technical security measures.
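Auditors in roles like the one above often verify that a control operates as intended with small scripted checks. As one hedged example using only the Python standard library (the host name is a placeholder, not a client system), the script below confirms that a service presents a valid TLS certificate and reports how long until it expires:

```python
import socket
import ssl
from datetime import datetime, timezone

host, port = "example.com", 443  # placeholder endpoint to audit

context = ssl.create_default_context()  # verifies the chain and host name
with socket.create_connection((host, port), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()

# 'notAfter' is a timestamp such as 'Jun  1 12:00:00 2026 GMT'.
expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
days_left = (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days
print(f"{host}: certificate valid until {expires:%Y-%m-%d} ({days_left} days left)")
```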

Posted 2 days ago

Apply

Exploring GCP Jobs in India

The job market for Google Cloud Platform (GCP) professionals in India is rapidly growing as more and more companies are moving towards cloud-based solutions. GCP offers a wide range of services and tools that help businesses in managing their infrastructure, data, and applications in the cloud. This has created a high demand for skilled professionals who can work with GCP effectively.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for GCP professionals in India varies based on experience and job role. Entry-level positions can expect a salary range of INR 5-8 lakhs per annum, while experienced professionals can earn anywhere from INR 12-25 lakhs per annum.

Career Path

Typically, a career in GCP progresses from a Junior Developer to a Senior Developer, then to a Tech Lead position. As professionals gain more experience and expertise in GCP, they can move into roles such as Cloud Architect, Cloud Consultant, or Cloud Engineer.

Related Skills

In addition to GCP, professionals in this field are often expected to have skills in:

  • Cloud computing concepts
  • Programming languages such as Python, Java, or Go
  • DevOps tools and practices
  • Networking and security concepts
  • Data analytics and machine learning

Interview Questions

  • What is Google Cloud Platform and its key services? (basic)
  • Explain the difference between Google Cloud Storage and Google Cloud Bigtable. (medium)
  • How would you optimize costs in Google Cloud Platform? (medium)
  • Describe a project where you implemented CI/CD pipelines in GCP. (advanced)
  • How does Google Cloud Pub/Sub work and when would you use it? (medium)
  • What is Cloud Spanner and how is it different from other database services in GCP? (advanced)
  • Explain the concept of IAM and how it is implemented in GCP. (medium)
  • How would you securely transfer data between different regions in GCP? (advanced)
  • What is Google Kubernetes Engine (GKE) and how does it simplify container management? (medium)
  • Describe a scenario where you used Google Cloud Functions in a project. (advanced)
  • How do you monitor performance and troubleshoot issues in GCP? (medium)
  • What is Google Cloud SQL and when would you choose it over other database options? (medium)
  • Explain the concept of VPC (Virtual Private Cloud) in GCP. (basic)
  • How do you ensure data security and compliance in GCP? (medium)
  • Describe a project where you integrated Google Cloud AI services. (advanced)
  • What is the difference between Google Cloud CDN and Google Cloud Load Balancing? (medium)
  • How do you handle disaster recovery and backups in GCP? (medium)
  • Explain the concept of auto-scaling in GCP and when it is useful. (medium)
  • How would you set up a multi-region deployment in GCP for high availability? (advanced)
  • Describe a project where you used Google Cloud Dataflow for data processing. (advanced)
  • What are the best practices for optimizing performance in Google Cloud Platform? (medium)
  • How do you manage access control and permissions in GCP? (medium)
  • Explain the concept of serverless computing and how it is implemented in GCP. (medium)
  • What is the difference between Google Cloud Identity and Access Management (IAM) and AWS IAM? (advanced)
  • How do you ensure data encryption at rest and in transit in GCP? (medium)
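Several of the questions above (Pub/Sub, IAM, BigQuery) are commonly followed by a hands-on exercise. As a minimal, illustrative answer to the Pub/Sub question, assuming the google-cloud-pubsub client library is installed and using placeholder project and topic names, a publisher might look like this:

```python
from google.cloud import pubsub_v1

# Placeholder identifiers; real values come from your GCP project.
project_id = "my-demo-project"
topic_id = "orders"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

# Messages are bytes; attributes carry routing/filtering metadata.
future = publisher.publish(topic_path, b"order created", order_id="12345")
print(f"Published message {future.result()} to {topic_path}")
```

Being able to explain what the returned future represents, how subscribers acknowledge messages, and which IAM roles the publisher's service account needs is usually what separates a basic answer from a strong one.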

Closing Remark

As the demand for GCP professionals continues to rise in India, now is the perfect time to upskill and pursue a career in this field. By mastering GCP and related skills, you can unlock numerous opportunities and build a successful career in cloud computing. Prepare well, showcase your expertise confidently, and land your dream job in the thriving GCP job market in India.

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies