
558 Composer Jobs - Page 18

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Experience: 5-12 years
Primary skills: GCP, BigQuery, Python
Notice period: Immediate to 30 days

Requirements:
- 5+ years of overall experience in data engineering
- Strong experience with GCP services such as BigQuery, Cloud Storage, and Cloud Composer
- Hands-on experience in Python
- Proficiency in PySpark and Databricks
- Strong understanding of data warehouse concepts, ETL processes, and data modelling
- GCP certifications are an added advantage
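For context, a minimal PySpark sketch of the kind of GCP work this posting describes: reading a BigQuery table and landing an aggregate in Cloud Storage. The project, dataset, table, column, and bucket names are hypothetical, and it assumes the spark-bigquery connector is available on the cluster.

```python
# Hedged sketch: read a BigQuery table, aggregate, and land the result in GCS.
# Assumes the spark-bigquery connector is on the classpath; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

# Read source data from BigQuery (hypothetical table).
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales_dw.orders")
    .load()
)

# Simple transformation: daily revenue per region (hypothetical columns).
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"))
)

# Write the curated output to Cloud Storage as Parquet.
daily_revenue.write.mode("overwrite").parquet("gs://my-curated-bucket/daily_revenue/")
```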

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Experience: 5-12 years
Primary skills: GCP, BigQuery, Python
Notice period: Immediate to 30 days

Requirements:
- 5+ years of overall experience in data engineering
- Strong experience with GCP services such as BigQuery, Cloud Storage, and Cloud Composer
- Hands-on experience in Python
- Proficiency in PySpark and Databricks
- Strong understanding of data warehouse concepts, ETL processes, and data modelling
- GCP certifications are an added advantage

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote


GCP Data Engineer - On-Premises to Cloud SQL Migration
Experience: 5-8 years
Location: Noida
Notice period: Immediate or currently serving
Work mode: WFO/Remote/Hybrid (depending on the client's requirement)
Budget: Open

Data Engineer IV (5-8 years of experience)

Job Description:
As a Data Engineer focused on migrating on-premises databases to Google Cloud SQL, you will play a critical role in solving complex problems and creating value for our business by ensuring reliable, scalable, and efficient data migration processes. You will be responsible for architecting, designing, and implementing custom pipelines on the GCP stack to facilitate seamless migration.

Required Skills:
• 5+ years of industry experience in data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
• Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence.
• Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources.
• Strong collaboration with analysts and business process owners to translate business requirements into technical solutions.
• Proficiency in scripting languages (shell scripting, Python, SQL).
• Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc.
• Adherence to best development practices, including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code.
• Familiarity with CI/CD processes using GitHub, Cloud Build, and the Google Cloud SDK.

Qualifications:
• Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
• GCP Certified Data Engineer (preferred).
• Excellent verbal and written communication skills, with the ability to effectively advocate technical solutions to research scientists, engineering teams, and business audiences.
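As an illustration of one step in such a migration, a small hedged Python sketch that bulk-loads an on-premises CSV extract into Cloud SQL for PostgreSQL. The connection details, table, columns, and file path are placeholders; it assumes connectivity is already in place, for example via the Cloud SQL Auth Proxy.

```python
# Hedged sketch of one migration step: bulk-load an on-prem CSV extract into
# Cloud SQL for PostgreSQL. Credentials, table, and file path are placeholders.
import psycopg2

EXTRACT_FILE = "/data/exports/customers_20240101.csv"   # hypothetical export

conn = psycopg2.connect(
    host="127.0.0.1",          # Cloud SQL Auth Proxy endpoint (assumed running locally)
    port=5432,
    dbname="sales",
    user="migration_user",
    password="change-me",
)

try:
    with conn, conn.cursor() as cur:
        # Use COPY for efficient bulk loads instead of row-by-row INSERTs.
        with open(EXTRACT_FILE, "r", encoding="utf-8") as f:
            cur.copy_expert(
                "COPY customers (customer_id, name, email, created_at) "
                "FROM STDIN WITH (FORMAT csv, HEADER true)",
                f,
            )
finally:
    conn.close()
```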

Posted 3 weeks ago

Apply

15.0 years

0 Lacs

India

Remote


Job Title: Data Engineer Lead - AEP
Location: Remote
Experience Required: 12-15 years overall; 8+ years in data engineering; 5+ years leading data engineering teams; cloud migration and consulting experience (GCP preferred)

Job Summary:
We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architecting scalable data solutions and ensuring successful delivery in high-impact environments.

Key Responsibilities:
- Lead end-to-end data engineering projects, including cloud migration of legacy ETL pipelines and data warehouses to GCP (BigQuery).
- Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services.
- Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks.
- Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders.
- Define and enforce data engineering best practices, coding standards, and CI/CD processes.
- Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture.
- Monitor project progress, ensure delivery timelines, and manage client expectations.
- Engage in technical pre-sales and solutioning, driving excellence in consulting delivery.

Technical Skills & Tools:
- Cloud platforms: strong experience with Google Cloud Platform (GCP), particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, and Pub/Sub.
- ETL/ELT tools: Apache Airflow, Dataform, dbt (if applicable).
- Languages: Python, SQL, shell scripting.
- Data warehousing: BigQuery, Snowflake (optional), traditional data warehouses (e.g., Teradata, Oracle).
- DevOps: Git, CI/CD pipelines, Docker.
- Data modeling: dimensional modeling, Data Vault, star/snowflake schemas.
- Data governance and lineage: Dataplex, Collibra, or equivalent tools.
- Monitoring and logging: Stackdriver, Datadog, or similar.

Preferred Qualifications:
- Proven consulting experience with premium clients or Tier 1 consulting firms.
- Hands-on experience leading large-scale cloud migration projects.
- GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect).
- Strong client communication, stakeholder management, and leadership skills.
- Experience with agile methodologies and project management tools like JIRA.
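To give a flavour of the BigQuery-centric ELT work described above, a minimal hedged sketch using the google-cloud-bigquery client to run an in-warehouse transformation. The project, dataset, table, and column names are assumptions, not part of the posting.

```python
# Hedged ELT sketch: raw data is already landed in BigQuery and is transformed
# in place with SQL. Project, dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

TRANSFORM_SQL = """
CREATE OR REPLACE TABLE curated.daily_sales
PARTITION BY order_date AS
SELECT
  DATE(order_ts)           AS order_date,
  region,
  SUM(amount)              AS revenue,
  COUNT(DISTINCT order_id) AS order_count
FROM raw.orders
WHERE order_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY order_date, region
"""

job = client.query(TRANSFORM_SQL)   # runs the ELT step inside BigQuery
job.result()                        # block until the transformation completes
print(f"Transformation finished, job_id={job.job_id}")
```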

Posted 3 weeks ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description

Job Purpose
ICE Data Services India Private Limited, a subsidiary of Intercontinental Exchange, presents a unique opportunity to work with cutting-edge technology and business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. A successful candidate will be able to multitask in a dynamic team-based environment, demonstrating strong problem-solving and decision-making abilities and the highest degree of professionalism.

The Engineer, Enterprise Endpoint Solutions is part of the team responsible for the global corporate endpoint computing environment. This position is specifically charged with the management and maintenance of the workstation environment for all ICE, Inc. and subsidiary companies. It requires strong technical proficiency with a range of enterprise tools, as well as an eager attitude, professionalism, and solid communication skills.

Responsibilities

Must have:
- Workstation OS: familiarity with Windows OS.
- Application packaging: experience building applications as MSI/MST packages using Flexera AdminStudio or PSADT (PowerShell App Deployment Toolkit). Exposure to packaging applications using tools such as MS Intune and JAMF is an added advantage for this role.
- Desktop management systems: experience building and managing desktop management infrastructure using tools such as MECM, MS Intune, Tanium Core Management, Autopilot, and Azure.
- Image management: experience capturing images using MDT or PXE methodology, creating task sequences, and deploying them through SCCM/MECM.
- Patch management: software and OS patch management using SCCM, Cloud Management Gateway (CMG), Windows Evergreen, Windows Update for Business, etc.
- Mobile device management: experience managing Android, Windows, and Apple mobile devices using management tools such as Intune, AirWatch, and JAMF.
- Scripting: strong scripting skills in VBScript, batch, and PowerShell are required, with advanced knowledge of PowerShell for management and automation a big plus. Should be comfortable performing tasks from the command line.

Good to have:
- Workstation OS: strong macOS experience, with familiarity with UNIX shell scripting (sh, bash, zsh) or Python.
- macOS tooling: familiarity with JAMF a plus; experience with JAMF Composer or Visual Studio Code and munkipkg.

Knowledge And Experience
- 4-6+ years of experience in a desktop management role using various macOS technologies (JAMF, Unix, Python, bash, zsh) and Microsoft technologies (MECM, MS Intune, Azure, etc.).
- College degree in Engineering, MIS, CIS, or a related discipline preferred.
- Familiarity with creating test, migration, and delivery plans for delivering software and security patches globally.
- Familiarity with ServiceNow and change/incident management practices.
- Experience in the financial services industry a plus.
- Experience managing VDI environments (VMware Horizon, Citrix) a plus.

Posted 3 weeks ago

Apply

26.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Location: Hyderabad
Employment Type: Full-time

About us:
vSplash is a digital services provider with over 26 years of experience in delivering innovative, high-quality, and scalable solutions to small businesses. We specialize in website development, SEO, e-commerce, and other digital marketing services, ensuring fast and cost-effective digitization. We provide white-labelled services at large scale, building 1,500+ websites every month. By focusing on innovation and precision-driven processes, we help businesses enhance their digital presence and stay ahead in a rapidly changing digital landscape. We take pride in our capability to build WordPress websites of about 5 pages in 6 to 8 hours.

About the Role:
As an Associate Manager / Manager - Website Operations, you will provide technical and strategic guidance to team members and their supervisors in alignment with the customer's expectations and requirements. You will oversee operational progress and be responsible for optimizing team performance, implementing efficient workflows, and maintaining high standards of quality, SLA adherence, and client satisfaction. The role demands flexibility, a result-driven approach, and openness to AI initiatives that improve process effectiveness.

Key Responsibilities:
- Operations management: manage and direct the operations team to achieve business targets and meet service-level agreements (SLAs).
- Process development: create and improve standard operating procedures for all operational activities wherever and whenever required.
- Client communication: handle daily, weekly, and monthly client calls, address escalations, and ensure excellent client satisfaction.
- Team management: lead and collaborate with a large team, handling conflict situations, leave and people management, and overall resource utilization and allocation.
- Technical assistance: provide technical assistance to supervisors and work towards continuous improvement of the team's efficiency.
- Performance monitoring: conduct regular performance reviews and provide feedback.

Requirements:
- Total experience of 8+ years.
- Must have 5+ years of hands-on experience with popular WordPress page builders such as Beaver Builder, Elementor, Visual Composer, WPBakery, and Divi.
- Leadership: at least 3+ years of proven track record of managing and motivating large teams.
- Communication: exceptional verbal and written communication skills, coupled with a strong aptitude for delivering engaging and informative presentations.
- Client engagement: 2+ years of experience directly handling international client communication, understanding requirements, and managing escalations.
- Project management: ability to develop and maintain project plans, timelines, and resource allocation using project management tools (JIRA/Salesforce).
- Problem-solving: analytical mindset with a data-driven approach to resolving operational challenges.
- Flexibility: ability to work and respond to client needs around the clock, adapting to business requirements.
- Continuous improvement: a focus on continuous production and quality improvement, driving operational excellence.

Soft Skills:
- Adaptability: ability to keep up with fast-evolving digital trends and technologies in the automation space.
- Innovation: encouraging new ideas to enhance digital solutions and drive business growth.
- Client-centric communication: understanding client needs and providing effective solutions that improve their digital presence.
- Collaboration: working efficiently with cross-functional teams to deliver high-quality and scalable results.

THE CANDIDATE MUST BE BASED IN HYDERABAD

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Lead / Sr Manager, Platform Engineer
As the Data Platform Lead / Sr Manager, Platform Engineer, you will be responsible for leading the design, scalability, and reliability of the enterprise data analytics platform. You will drive secure, efficient, and resilient data transfer and platform enablement services across a global organization. This role is pivotal in ensuring that analytics-ready data is accessible, governed, and delivered at scale to support decision-making, reporting, and advanced data products, particularly in high-volume, fast-paced environments like retail or QSR.

Who we’re looking for:

Primary Responsibilities:

Platform Strategy & Operations
- Architect and manage scalable batch processing and data transfer pipelines to serve enterprise-wide analytics use cases.
- Continuously monitor platform health and optimize for performance, cost efficiency, and uptime.
- Implement platform observability, diagnostics, and incident response mechanisms to maintain service excellence.

Security, Compliance & Governance
- Ensure secure handling of data across ingestion, transfer, and processing stages, adhering to enterprise and regulatory standards.
- Establish protocols for secure, compliant, and auditable data movement and transformation.

Enablement & Support
- Provide Level 2/3 technical support for platform services, minimizing disruption and accelerating issue resolution.
- Drive user enablement by leading documentation efforts, publishing platform standards, and hosting training sessions.
- Collaborate with key business stakeholders to improve platform adoption and usability.

Collaboration & Delivery
- Partner with product, engineering, and analytics teams to support data initiatives across domains and markets.
- Ensure the platform supports reliable analytics workflows through automation, integration, and governed data access.
- Oversee platform releases, upgrades, and maintenance activities to ensure minimal downtime and a seamless user experience.

Skills:
- 8+ years of experience in platform engineering, data infrastructure, or analytics technology environments.
- Deep expertise in:
  - Batch processing and data orchestration tools (e.g., Airflow, Dataflow, Composer)
  - Secure data transfer protocols (e.g., SFTP, API-based, event streaming)
  - Advanced SQL for diagnostics and analytics enablement
  - Python for automation, scripting, and platform utilities
- Experience with cloud-native platforms (preferably GCP or AWS), including infrastructure-as-code (leveraging Terraform and Ansible) and DevOps tooling.
- Proven leadership of cross-functional technical teams delivering high-scale, high-availability platform solutions.
- Excellent collaboration, communication, and stakeholder engagement skills.
- Bachelor's or Master’s degree in Computer Science, Information Systems, or a related technical field.
- GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
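For illustration, a minimal Cloud Composer (Airflow) DAG sketch of the kind of batch file-to-warehouse pipeline this role covers. The bucket, project, dataset, and table names are hypothetical, and the operator import assumes the Airflow Google provider package is installed.

```python
# Hedged sketch of a daily batch ingestion DAG for Cloud Composer.
# Bucket, project, dataset, and table names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_file_ingestion",
    schedule="@daily",              # use schedule_interval on older Airflow versions
    start_date=datetime(2024, 1, 1),
    catchup=False,
    tags=["platform", "batch"],
) as dag:
    # Load the day's partner file drop from GCS into a BigQuery staging table.
    load_sales_file = GCSToBigQueryOperator(
        task_id="load_sales_file",
        bucket="inbound-partner-files",                      # hypothetical bucket
        source_objects=["sales/{{ ds }}/*.csv"],             # templated by run date
        destination_project_dataset_table="my-project.staging.sales_daily",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )
```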

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary:
We’re seeking a hands-on Platform Engineer to support our enterprise data integration and enablement platform. As a Platform Engineer III, you’ll be responsible for designing, maintaining, and optimizing secure and scalable data movement services, such as batch processing, file transfers, and data orchestration. This role is essential to ensuring reliable data flow across systems to power analytics, reporting, and platform services in a cloud-native environment.

Who we’re looking for:

Primary Responsibilities:

Hands-On Data Integration Engineering
- Build and maintain data transfer pipelines, file ingestion processes, and batch workflows for internal and external data sources.
- Configure and manage platform components that enable secure, auditable, and resilient data movement.
- Automate routine data processing tasks to improve reliability and reduce manual intervention.

Platform Operations & Monitoring
- Monitor platform services for performance, availability, and failures; respond quickly to disruptions.
- Tune system parameters and job schedules to improve throughput and processing efficiency.
- Implement logging, metrics, and alerting to ensure end-to-end observability of data workflows.

Security, Compliance & Support
- Apply secure protocols and encryption standards to data transfer processes (e.g., SFTP, HTTPS, GCS/AWS).
- Support compliance with internal controls and external regulations (e.g., GDPR, SOC2, PCI).
- Collaborate with security and infrastructure teams to manage access controls, service patches, and incident response.

Troubleshooting & Documentation
- Investigate and resolve issues related to data processing failures, delays, or quality anomalies.
- Document system workflows, configurations, and troubleshooting runbooks for team use.
- Provide support for platform users and participate in on-call rotations as needed.

Skills:
- 5+ years of hands-on experience in data integration, platform engineering, or infrastructure operations.
- Proficiency in:
  - Designing and supporting batch and file-based data transfers
  - Python scripting and SQL for diagnostics, data movement, and automation
  - Terraform scripting and deployment of cloud infrastructure services
  - GCP (preferred) or AWS data analytics services, such as:
    - GCP: Cloud Storage, BigQuery, Cloud Composer, Pub/Sub, Dataflow
    - AWS: S3, Glue, Redshift, Athena, Lambda, EventBridge, Step Functions
  - Cloud-native storage and compute optimization for data movement and processing
  - Infrastructure-as-code and CI/CD practices (e.g., Terraform, Ansible, Cloud Build, GitHub Actions)
- Strong analytical and debugging skills for troubleshooting issues in distributed, high-volume environments.
- Bachelor's degree in Computer Science, Information Systems, or a related technical field.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.
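As a hedged sketch of the secure file-transfer work described in this posting: pull a file from a partner SFTP endpoint with paramiko and stage it in Cloud Storage. The host, credentials, paths, and bucket names are placeholders, not real endpoints.

```python
# Hedged sketch of a secure file-transfer step: SFTP pull, then stage in GCS.
# Host, key path, object paths, and bucket names are placeholders.
import paramiko
from google.cloud import storage

SFTP_HOST = "sftp.partner.example.com"     # hypothetical partner endpoint
REMOTE_PATH = "/outbound/inventory_20240101.csv"
LOCAL_PATH = "/tmp/inventory_20240101.csv"
BUCKET = "raw-partner-drops"               # hypothetical landing bucket

# 1. Fetch the file over SFTP using key-based authentication.
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # pin host keys in production
ssh.connect(SFTP_HOST, username="transfer_svc", key_filename="/secrets/transfer_key")
sftp = ssh.open_sftp()
try:
    sftp.get(REMOTE_PATH, LOCAL_PATH)
finally:
    sftp.close()
    ssh.close()

# 2. Stage the file in GCS for downstream batch processing.
client = storage.Client()
blob = client.bucket(BUCKET).blob("inventory/2024-01-01/inventory.csv")
blob.upload_from_filename(LOCAL_PATH)
print(f"Uploaded to gs://{BUCKET}/{blob.name}")
```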

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Platform Engineering & Enablement Services Lead / Sr Manager, Platform Engineer
As the Platform Engineering & Enablement Services Lead / Sr. Manager, you will be responsible for driving the reliability, performance, and enablement of platform services that support secure data operations and analytics workflows across the enterprise. You will lead a team that designs, builds, and supports the infrastructure and tools necessary for seamless data movement, integration, and user enablement, ensuring agility, compliance, and operational excellence at scale.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Engineering
- Design, implement, and maintain high-performing batch processing and file transfer systems across the enterprise.
- Monitor and tune platform performance to ensure availability, reliability, and cost-effectiveness.
- Partner with IT and infrastructure teams to plan and coordinate system updates, patching, and platform upgrades.

Enablement & User Support
- Develop platform enablement services including onboarding, documentation, knowledge bases, and office hours.
- Facilitate training sessions and create self-service models to improve platform usability and adoption.
- Provide Tier 2/3 support for platform-related issues and coordinate escalations where needed.

Security & Governance
- Ensure platform services meet internal and external security, privacy, and compliance standards (e.g., SOC2, PCI, GDPR).
- Manage secure data transfer protocols and implement robust access control mechanisms.

Cross-Functional Collaboration
- Act as a liaison between platform engineering, data analytics, architecture, and security teams to ensure alignment.
- Drive platform automation, CI/CD enablement, and developer productivity initiatives across teams.

Documentation & Standards
- Maintain comprehensive documentation of platform configurations, best practices, and incident runbooks.
- Standardize engineering processes to support consistency, observability, and ease of scaling across environments.

Skills:
- 8+ years of experience in platform engineering, infrastructure operations, or enterprise systems enablement.
- Strong technical expertise in:
  - Batch processing frameworks and orchestration tools (e.g., Airflow, Composer, Dataflow)
  - Infrastructure scripting for cloud services (e.g., Terraform)
  - Secure data movement protocols (e.g., SFTP, APIs, messaging queues)
  - Infrastructure engineering, networking, and system integration
  - SQL and Python for troubleshooting, scripting, and platform automation
- Demonstrated experience in designing and supporting platform services in cloud-native environments (GCP or AWS preferred).
- Proven leadership in managing technical teams and platform support functions.
- Excellent communication, cross-team collaboration, and stakeholder management skills.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About McDonald’s:
One of the world’s largest employers with locations in more than 100 countries, McDonald’s Corporation has corporate opportunities in Hyderabad. Our global offices serve as dynamic innovation and operations hubs, designed to expand McDonald's global talent base and in-house expertise. Our new office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating our ability to deliver impactful solutions for the business and our customers across the globe.

Position Summary: Data Platform Support / Platform Engineer III
As a Data Platform Support Engineer, you will design, implement, and maintain scalable and secure platform services that enable efficient data transfers, API integrations, and platform operations across the enterprise. You will play a critical role in ensuring the performance, reliability, and security of data pipelines and platform services that power decision-making, reporting, and advanced data products, particularly in high-volume, fast-paced global environments.

Who we’re looking for:

Primary Responsibilities:

Platform Operations & Support
- Design, implement, and maintain secure and efficient platform solutions to support APIs, data transfers, and analytics workflows.
- Monitor platform health, optimize for performance and uptime, and implement diagnostic and observability practices.
- Provide Level 2/3 technical support for platform services to minimize disruption and accelerate issue resolution.
- Support platform upgrades, releases, and maintenance activities to ensure seamless user experiences.

Security, Compliance & Governance
- Ensure secure handling of data across transfer and integration processes, adhering to enterprise security and compliance standards.
- Implement and manage data transfer protocols (e.g., SFTP, API-based integrations) in a secure and auditable manner.

Enablement & Documentation
- Contribute to platform documentation, standards, and usage guidelines to drive user enablement and platform adoption.
- Support training sessions and user onboarding for platform services and tools.

Collaboration & Delivery
- Partner with engineering, product, and analytics teams to support platform initiatives and enhance system integration.
- Collaborate on continuous improvement initiatives to support analytics-ready data delivery across domains and markets.

Skills:
- 5+ years of experience in platform engineering, data infrastructure, or API/data transfer operations.
- Deep expertise in:
  - Data orchestration and batch processing tools (e.g., Airflow, Dataflow, Composer preferred)
  - Secure data transfer protocols and integration patterns (e.g., SFTP, APIs, event-driven transfers)
  - SQL for platform diagnostics and performance analytics
  - Python for automation, scripting, and platform utilities
- Hands-on experience with cloud-native environments (preferably GCP or AWS), infrastructure-as-code (e.g., Terraform, Ansible), and DevOps practices.
- Strong problem-solving and analytical skills with a focus on operational excellence.
- Demonstrated ability to collaborate across cross-functional technical teams.
- Strong communication and stakeholder engagement skills.
- Bachelor’s degree in Computer Science, Information Systems, or a related technical field.
- GCP/AWS certification is preferred.

Work location: Hyderabad, India
Work pattern: Full-time role.
Work mode: Hybrid.
Additional Information:
McDonald’s is committed to providing qualified individuals with disabilities with reasonable accommodations to perform the essential functions of their jobs. McDonald’s provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to sex, sex stereotyping, pregnancy (including pregnancy, childbirth, and medical conditions related to pregnancy, childbirth, or breastfeeding), race, color, religion, ancestry or national origin, age, disability status, medical condition, marital status, sexual orientation, gender, gender identity, gender expression, transgender status, protected military or veteran status, citizenship status, genetic information, or any other characteristic protected by federal, state or local laws. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

McDonald’s Capability Center India Private Limited (“McDonald’s in India”) is a proud equal opportunity employer and is committed to hiring a diverse workforce and sustaining an inclusive culture. At McDonald’s in India, employment decisions are based on merit, job requirements, and business needs, and all qualified candidates are considered for employment. McDonald’s in India does not discriminate based on race, religion, color, age, gender, marital status, nationality, ethnic origin, sexual orientation, political affiliation, veteran status, disability status, medical history, parental status, genetic information, or any other basis protected under state or local laws. Nothing in this job posting or description should be construed as an offer or guarantee of employment.

Posted 3 weeks ago

Apply

4.0 - 5.0 years

0 Lacs

Vadodara, Gujarat, India

On-site


About Sun Pharma:
Sun Pharmaceutical Industries Ltd. (Sun Pharma) is the fourth largest specialty generic pharmaceutical company in the world, with global revenues of US$ 5.4 billion. Supported by 43 manufacturing facilities, we provide high-quality, affordable medicines, trusted by healthcare professionals and patients, to more than 100 countries across the globe.

Job Summary
The EDMS Development and Configuration Specialist will be responsible for the successful development, deployment, configuration, and ongoing support of EDMS 21.2. This role requires a deep understanding of EDMS LSQM workflows, strong technical skills, and the ability to work closely with cross-functional teams to ensure the EDMS meets the needs of the organization.

Roles and Responsibilities
• Assist in the development and maintenance of the Documentum D2 LSQM application, including custom workflows and document management solutions.
• Collaborate with senior developers to understand requirements and translate them into technical specifications.
• Support the testing and debugging of Documentum applications to ensure high-quality output and performance.
• Document development processes and maintain accurate technical documentation.
• Solid understanding of content management principles and best practices, with experience in implementing Documentum solutions in enterprise environments.
• Familiarity with Java, SQL, and web services integration for developing Documentum applications.
• Expertise in the Documentum platform and its components, including Documentum Content Server and Documentum Webtop.
• Proficiency in using development tools such as Documentum Composer and Documentum Administrator.
• Experience with version control systems (e.g., Git) and agile development methodologies.

Qualifications:
• Bachelor's degree in Information Technology or a related field.
• Minimum of 4-5 years of experience in EDMS LSQM configuration, preferably in a pharmaceutical or biotech environment.
• Strong understanding of Category 1, Category 2, and Category 3 workflows.
• Proficiency in Documentum LSQM software.
• Ability to manage multiple tasks and projects simultaneously.
• Strong analytical and problem-solving skills.
• Excellent communication and interpersonal skills.

Preferred Qualifications:
• Advanced degree in Information Technology or a related field.
• Experience with database management and DQL.
• Understanding of Documentum Content Server and its APIs.
• Familiarity with Documentum DQL (Documentum Query Language).
• Experience in Documentum development, including proficiency in Documentum Foundation Classes (DFC) and Documentum Query Language (DQL).
• Basic knowledge of RESTful services and web development principles.

Selection Process:
Interested candidates are required to apply through the listing on Jigya; only applications received through Jigya will be evaluated further. Shortlisted candidates may need to appear in an online assessment and/or a technical screening interview administered by Jigya on behalf of Sun Pharma. Candidates selected after the screening rounds will be processed further by Sun Pharma.

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Qualification
- 7-10 years of overall IT experience, with a minimum of 6+ years in big data technologies and build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Graduate in Computer Science, Informatics, Information Systems, or another quantitative field.
- Data Engineering certification from GCP/Cloudera/Hortonworks.

Job Description
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Lead a team of data engineers, managing workload and mentoring the team on technical challenges.
- Problem solver with excellent interpersonal skills and the ability to make sound, complex decisions in a fast-paced technical environment.
- Support building the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and GCP ‘big data’ technologies.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Create data engineering tools for faster data transformation and data lake loading.
- Identify challenges faced in DLL and propose and implement solutions.

Skills/Tools/Techniques
- Hands-on proficiency with big data tools: Hadoop, Hive, Spark, and Scala.
- Experience with GCP cloud services: Google Cloud Storage, BigQuery, Spanner, Cloud Pub/Sub, Stackdriver, and Composer.
- Experience in CI/CD using GitLab EE.
- Experience with data pipeline and workflow management tools: Oozie, Airflow, etc.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Job Description:
- Customize and configure Oracle Fusion modules as per business requirements.
- Develop and modify reports (BIP, OTBI, FRS, Hyperion Smart View), interfaces, extensions (Page Composer, Application Composer with Groovy scripting, Process Composer), workflows (Oracle BPM, AMX), forms (ADF, Java-based), VBCS, and page customizations to enhance functionality.
- Integrate Oracle Fusion applications with other business systems and third-party applications using Oracle Integration Cloud.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g., HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
- Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g., Scrum, XP)
- Relational databases (e.g., SQL Server, MySQL)
- Atlassian tooling (e.g., JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

Remote


Job Title - GCP Administrator
Location - Remote (hybrid for Chennai and Mumbai)
Experience - 5 to 8 years

We are looking for an experienced GCP Administrator to join our team. The ideal candidate will have strong hands-on experience with IAM administration, multi-account management, BigQuery administration, performance optimization, monitoring, and cost management within Google Cloud Platform (GCP).

Responsibilities:
● Manage and configure roles/permissions in GCP IAM, following the principle of least-privileged access
● Manage the BigQuery service by optimizing slot assignments and SQL queries, adopting FinOps practices for cost control, troubleshooting and resolving critical data queries, etc.
● Collaborate with teams such as Data Engineering, Data Warehousing, Cloud Platform Engineering, and SRE for efficient data management and operational practices in GCP
● Create automations and monitoring mechanisms for GCP data-related services, processes, and tasks
● Work with development teams to design the GCP-specific cloud architecture
● Provision and de-provision GCP accounts and resources for internal projects
● Manage and operate multiple GCP subscriptions
● Keep technical documentation up to date
● Proactively stay up to date on GCP announcements, services, and developments

Requirements:
● Must have 5+ years of work experience in provisioning, operating, and maintaining systems in GCP
● Must have a valid certification as either a GCP Associate Cloud Engineer or GCP Professional Cloud Architect
● Must have hands-on experience with GCP services such as Identity and Access Management (IAM), BigQuery, Google Kubernetes Engine (GKE), etc.
● Must be capable of providing support and guidance on GCP operations and services based on enterprise needs
● Must have a working knowledge of Docker containers and Kubernetes
● Must have strong communication skills and the ability to work both independently and in a collaborative environment
● Fast learner and achiever who sets high personal goals
● Must be able to work on multiple projects and consistently meet project deadlines
● Must be willing to work on a shift basis depending on project requirements

Good to Have:
● Experience in Terraform automation for GCP infrastructure provisioning
● Experience with Cloud Composer, Dataproc, Dataflow, Storage, and Monitoring services
● Experience in building and supporting any form of data pipeline
● Multi-cloud experience with AWS
● New Relic monitoring

Perks:
● Day off on the 3rd Friday of every month (one long weekend each month)
● Monthly wellness reimbursement program to promote health and well-being
● Paid paternity and maternity leaves

Notice period: Immediate to 30 days
Email to: poniswarya.m@aptita.com
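For illustration of the BigQuery cost-management duties above, a small hedged sketch that lists the heaviest recent query jobs from INFORMATION_SCHEMA. The project name is a placeholder and the region qualifier should match your environment.

```python
# Hedged FinOps-style sketch: list yesterday's heaviest BigQuery jobs by slot usage.
# Project name and region qualifier are assumptions; adjust to your environment.
from google.cloud import bigquery

client = bigquery.Client(project="my-admin-project")

SQL = """
SELECT
  user_email,
  job_id,
  total_slot_ms / 1000 / 3600        AS slot_hours,
  total_bytes_billed / POW(1024, 4)  AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
  AND creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY total_slot_ms DESC
LIMIT 20
"""

for row in client.query(SQL).result():
    print(f"{row.user_email:40s} {row.slot_hours:8.2f} slot-h  {row.tib_billed:6.3f} TiB")
```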

Posted 3 weeks ago

Apply

7.0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


Greetings from TCS!!

TCS is hiring for Data Analyst
Interview mode: Virtual
Required experience: 7-18 years
Work location: PAN India

Strong knowledge of:
- Data processing software and strategies
- Big Data, information retrieval, data mining
- SQL
- 4+ years of experience with cloud platforms and customer-facing projects
- Strong ability to successfully interface (verbal and written) with clients in a concise manner while managing expectations at both executive and technical levels

General:
- Data platforms and data lakes
- Relational and non-relational databases
- Streaming and batch pipelines
- SQL engines (possible options: MySQL, SQL Server, PostgreSQL)
- NoSQL engines (possible options: MongoDB, Cassandra, HBase, Dynamo, Redis)
- Google Cloud data services: Cloud SQL, BigQuery, Dataflow, Dataproc, Bigtable, Composer, Cloud Functions
- Python
- Hadoop ecosystem / Apache software: Spark, Beam, Hive, Airflow, Sqoop, Oozie
- Code repositories / CI/CD tools (possible options: GitHub, Cloud Source Repositories, GitLab, Azure DevOps)

If interested, kindly send your updated CV and the details below via DM or e-mail to srishti.g2@tcs.com:
Name:
E-mail ID:
Contact number:
Highest qualification:
Preferred location:
Highest qualification university:
Current organization:
Total years of experience:
Relevant years of experience:
Any gap: mention number of months/years (career/education):
If any, reason for gap:
Is it rebegin:
Previous organization name:
Current CTC:
Expected CTC:
Notice period:
Have you worked with TCS before (permanent/contract):

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Title #1: GCP Data Engineer
Job role: GCP Data Engineer
Required experience: 4+ years
Employment: Permanent - RandomTrees - https://www.randomtrees.com
Mode of work: Hyderabad hybrid / remote
Notice period: Maximum 15-30 days, or serving notice up to 30 days

Job Summary:
We are seeking candidates with 3+ years of experience as a software engineer, or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps.

Primary / expected skill set: GCP DE, BigQuery, Dataflow, PySpark, GCS, Airflow/Composer.

Key Requirements:
· Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
· Experience working with cloud platforms such as GCP and BigQuery.
· Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
· Must have: experience in pipeline orchestration (e.g., Airflow).
· Must have: good hands-on experience with Dataflow (Python or Java) and PySpark.
· Hands-on experience with migration is an added advantage.

Project #2 - UMG
Title #2: GCP Data Engineer - Airflow
Job role: GCP Data Engineer
Required experience: 4+ years
Employment: Permanent - RandomTrees - https://www.randomtrees.com
Mode of work: Hyderabad hybrid / remote
Notice period: Maximum 15-30 days, or serving notice up to 30 days

Job Description (Airflow):
We are seeking candidates with 3+ years of experience as a software engineer, or equivalent, designing large data-heavy distributed systems and/or high-traffic web apps.

Primary skills: GCP, Python coding (must), SQL coding skills, BigQuery, Airflow and Airflow DAGs.

Requirements:
· Hands-on experience designing and managing large data models, writing performant SQL queries, and working with large datasets and related technologies.
· Experience working with cloud platforms such as GCP and BigQuery.
· Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment.
· Must have: experience in pipeline orchestration (e.g., Airflow).
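To illustrate the Dataflow (Python) skill these postings call out, a minimal Apache Beam sketch that reads CSV lines from GCS, filters them, and writes results back. Bucket paths, column layout, and the runner configuration are placeholders; as written it runs locally with the DirectRunner.

```python
# Hedged sketch of a small Beam pipeline of the kind Dataflow would run.
# Bucket paths and CSV layout are hypothetical; swap DirectRunner for DataflowRunner
# (plus project/region/temp_location options) to run on Google Cloud Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(runner="DirectRunner")

def parse_order(line: str) -> dict:
    # Assumes a simple three-column CSV: order_id,region,amount
    order_id, region, amount = line.split(",")
    return {"order_id": order_id, "region": region, "amount": float(amount)}

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadOrders" >> beam.io.ReadFromText("gs://raw-bucket/orders/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_order)
        | "HighValueOnly" >> beam.Filter(lambda o: o["amount"] >= 1000)
        | "Format" >> beam.Map(lambda o: f'{o["order_id"]},{o["region"]},{o["amount"]}')
        | "Write" >> beam.io.WriteToText("gs://curated-bucket/high_value_orders")
    )
```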

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g., HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
- Source code control management systems (e.g., SVN/Git, GitHub) and build tools
- Agile environments (e.g., Scrum, XP)
- Relational databases
- Atlassian tooling (e.g., JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do
- Design, develop, and operate high-scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.).
- Work across teams to integrate our systems with existing internal systems, Data Fabric, and the CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve them by analyzing the sources of issues and their impact on network or service operations and quality.
- Manage sole project priorities, deadlines, and deliverables.
- Research, create, and develop software applications to extend and improve on Equifax solutions.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, CSS
- 5+ years of experience with cloud technology: GCP, AWS, or Azure
- 5+ years of experience designing and developing cloud-native solutions
- 5+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes
- 5+ years of experience deploying and releasing software using Jenkins CI/CD pipelines; understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What Could Set You Apart
- Self-starter who identifies and responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g., HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
- Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g., Scrum, XP)
- Relational databases (e.g., SQL Server, MySQL)
- Atlassian tooling (e.g., JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Introduction

In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:

  • Hands-on work across multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
  • Python and SQL development; being proactive, collaborative, and able to respond to critical situations
  • Analysing data against functional business requirements and interfacing directly with the customer

Preferred Education

Master's Degree

Required Technical and Professional Expertise

  • 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
  • Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
  • Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work
  • You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
  • End-to-end functional knowledge of the data pipeline/transformation implementations you have delivered, including the purpose and KPIs each transformation served

Preferred Technical and Professional Experience

  • Experience with AEM core technologies: OSGi services, Apache Sling, the Granite framework, the Java Content Repository API, Java 8+, localization
  • Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git
  • Knowledge of patterns and good practices to design and develop quality, clean code
  • Knowledge of HTML, CSS, JavaScript, and jQuery
  • Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence
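For a sense of the "Python and SQL on BigQuery" work this role describes, the sketch below runs a parameterised SQL query through the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders invented for illustration.

```python
# Hypothetical example: run a parameterised SQL query against BigQuery from Python.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

sql = """
    SELECT customer_id, SUM(amount) AS total_spend
    FROM `my-project.sales.orders`               -- placeholder table
    WHERE order_date >= @start_date
    GROUP BY customer_id
    ORDER BY total_spend DESC
    LIMIT 10
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01")]
)

# query() submits the job; result() blocks until it finishes and yields rows.
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total_spend)
```

Using a query parameter rather than string formatting keeps the SQL safe to reuse across scheduled runs, which is typical of the pipeline work described above.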

Posted 3 weeks ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Software Engineer II, SMART

Candidates for this position are preferred to be based in Bangalore, India and will be expected to comply with their team's hybrid work schedule requirements.

Who we are

Wayfair runs the largest custom e-commerce large parcel network in the United States, approximately 1.6 million square meters of logistics space. The nature of the network is inherently a highly variable ecosystem that requires flexible, reliable, and resilient systems to operate efficiently. We are looking for a passionate Backend Software Engineer to join the Fulfilment Optimisation team. This team builds the platforms that determine how customer orders are fulfilled, optimising for Wayfair profitability and customer delight. A big part of our work revolves around enhancing and scaling customer-facing platforms that provide fulfilment information on our websites, starting at the top of the customer funnel on the search pages all the way through orders being delivered. Throughout this customer journey, we are responsible for maintaining an accurate representation of our dynamic supply chain, determining how different products will fit into boxes, predicting how these boxes will flow through warehouses and trucks, and ultimately surfacing the information our customers need to inform their decision and the details our suppliers and carriers require to successfully execute on the promises made to our customers. We do all of this in milliseconds, thousands of times per second.

The Growth & Ad Platforms Engineering team is responsible for building Wayfair's class-leading, data-centric technology platform that optimizes and powers all of Wayfair's paid advertising. It touches hundreds of millions of consumers and drives billions of dollars in annual revenue. Our platforms allow us to scale our marketing efforts efficiently by automating key processes, interacting with ad vendors in real time, and leveraging hundreds of terabytes of data and ML algorithms for optimisation and effective decision making on millions of items in our catalog. The types of systems built by this team range from low-latency REST APIs, to real-time streaming systems, to high-performance large-scale batch systems.

What you will do

  • Work with a broader highly collaborative cross-functional team that includes product managers, data scientists, and analysts.
  • Work with a variety of technologies, including Java, Spark, Kafka, Aerospike, Hadoop, Airflow, RESTful web services, gRPC, and Kubernetes. Additionally, you’ll work with various managed GCP offerings like BigQuery, Composer and Vertex AI.
  • Build platforms and services that allow us to make real-time ML-powered decisions that improve the customer’s onsite search experience.
  • Deliver direct measurable results for our business and customers through improved product recommendations.
  • Mentor junior engineers to develop the next generation of Wayfair engineering.
  • Provide high quality code and technical design reviews.
  • Contribute to the code base, with a mind to best practices and an equally high degree of autonomy.

What you will need

  • A Bachelor’s Degree in Computer Science, Data Science, or a related engineering discipline.
  • 3+ years of experience in a modern programming language, preferably Java.
  • Knowledge of scalable distributed systems with a deep understanding of object-oriented design and design patterns.
  • Knowledge of designing APIs and microservices.
  • Experience working on cloud technologies, specifically GCP, is a plus.
  • Experience with Lucene-based search engines like Elasticsearch is a plus.
  • Knowledge of recommendation systems and productionalizing ML models is a plus.
  • Experience using Kubernetes, Docker, Buildkite, and Terraform for containerization and CI/CD is a plus.
  • Excellent communication skills and ability to work effectively with engineers, product managers, data scientists, analysts and business stakeholders.

About Wayfair Inc.

Wayfair is one of the world’s largest online destinations for the home. Whether you work in our global headquarters in Boston, or in our warehouses or offices throughout the world, we’re reinventing the way people shop for their homes. Through our commitment to industry-leading technology and creative problem-solving, we are confident that Wayfair will be home to the most rewarding work of your career. If you’re looking for rapid growth, constant learning, and dynamic challenges, then you’ll find that amazing career opportunities are knocking.

No matter who you are, Wayfair is a place you can call home. We’re a community of innovators, risk-takers, and trailblazers who celebrate our differences, and know that our unique perspectives make us stronger, smarter, and well-positioned for success. We value and rely on the collective voices of our employees, customers, community, and suppliers to help guide us as we build a better Wayfair – and world – for all. Every voice, every perspective matters. That’s why we’re proud to be an equal opportunity employer. We do not discriminate on the basis of race, color, ethnicity, ancestry, religion, sex, national origin, sexual orientation, age, citizenship status, marital status, disability, gender identity, gender expression, veteran status, genetic information, or any other legally protected characteristic.

Your personal data is processed in accordance with our Candidate Privacy Notice ( https://www.wayfair.com/careers/privacy ). If you have any questions or wish to exercise your rights under applicable privacy and data protection laws, please contact us at dataprotectionofficer@wayfair.com .
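To ground the "high-performance large-scale batch systems" part of this description, below is a minimal, hypothetical PySpark sketch that aggregates ad-click events stored in GCS. The bucket paths and column names are illustrative only, and the team's actual jobs are described as Java/Spark rather than PySpark; the point is the shape of a batch rollup, not this exact stack.

```python
# Hypothetical PySpark batch job: aggregate ad-click events from GCS by campaign.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ad-clicks-daily-rollup").getOrCreate()

# Placeholder input path; on Dataproc the gs:// connector is available out of the box.
clicks = spark.read.json("gs://my-bucket/ad_clicks/date=2024-01-01/*.json")

daily_rollup = (
    clicks
    .where(F.col("event_type") == "click")
    .groupBy("campaign_id")
    .agg(
        F.count("*").alias("clicks"),
        F.countDistinct("user_id").alias("unique_users"),
        F.sum("cost_micros").alias("spend_micros"),
    )
)

# Placeholder output path; downstream this could be loaded into BigQuery.
daily_rollup.write.mode("overwrite").parquet("gs://my-bucket/rollups/ad_clicks/2024-01-01/")

spark.stop()
```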

Posted 3 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary

At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC

At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities

  • Strong skills in Python and GCP services (including Google Cloud Composer, BigQuery and Google Cloud Storage)
  • Strong expertise in writing SQL and PL/SQL in Oracle, MySQL or any other relational database
  • Good to have: data warehousing and ETL (any tool)
  • Proven experience in using GCP services is preferred
  • Strong presentation and communication skills
  • Analytical and problem-solving skills

Mandatory Skill Sets: GCP Data Engineer
Preferred Skill Sets: GCP Data Engineer
Years of Experience Required: 4-8
Education Qualification: BTech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology

Required Skills: GCP Dataflow, Good Clinical Practice (GCP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}

Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
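Since this role pairs Python with Google Cloud Composer, here is a minimal, hypothetical Airflow DAG of the sort that runs on Composer: it loads a daily CSV drop from GCS into BigQuery. The DAG ID, bucket, object prefix, and destination table are placeholders, and the operator shown assumes the standard Google provider package that Composer environments ship with.

```python
# Hypothetical Cloud Composer (Airflow) DAG: load a daily CSV from GCS into BigQuery.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_load",             # placeholder
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",         # run at 02:00 every day
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="gcs_to_bq_sales",
        bucket="my-landing-bucket",                                       # placeholder
        source_objects=["sales/{{ ds }}/*.csv"],                          # templated by execution date
        destination_project_dataset_table="my-project.analytics.sales",  # placeholder
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```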

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do

  • Design, develop, and operate high scale applications across the full engineering stack
  • Design, develop, test, deploy, maintain, and improve software
  • Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
  • Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
  • Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
  • Participate in a tight-knit, globally distributed engineering team
  • Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality
  • Manage sole project priorities, deadlines, and deliverables
  • Research, create, and develop software applications to extend and improve on Equifax Solutions
  • Collaborate on scalability issues involving access to data and information
  • Actively participate in Sprint planning, Sprint Retrospectives, and other team activity

What Experience You Need

  • Bachelor's degree or equivalent experience
  • 5+ years of software engineering experience
  • 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
  • 5+ years experience with Cloud technology: GCP, AWS, or Azure
  • 5+ years experience designing and developing cloud-native solutions
  • 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
  • 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart

  • Self-starter that identifies/responds to priority shifts with minimal supervision
  • Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
  • UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
  • Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
  • Source code control management systems (e.g. SVN/Git, Github) and build tools
  • Agile environments (e.g. Scrum, XP)
  • Relational databases
  • Atlassian tooling (e.g. JIRA, Confluence, and Github)
  • Developing with modern JDK (v1.7+)

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
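Bigtable appears in the "big data processing" item of this listing; the sketch below is a minimal, hypothetical write-then-read against a Bigtable table using the google-cloud-bigtable Python client. The project, instance, table, column family, and row key are all placeholder names chosen for illustration.

```python
# Hypothetical Bigtable read/write sketch; instance, table, and column family are placeholders.
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
instance = client.instance("my-instance")
table = instance.table("user_profiles")

# Write one cell into the "profile" column family.
row = table.direct_row(b"user#42")
row.set_cell("profile", b"email", b"someone@example.com")
row.commit()

# Read it back; cells are keyed by column family, then column qualifier (bytes).
fetched = table.read_row(b"user#42")
if fetched is not None:
    cell = fetched.cells["profile"][b"email"][0]
    print(cell.value.decode("utf-8"))
```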

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You’ll Do

  • Design, develop, and operate high scale applications across the full engineering stack
  • Design, develop, test, deploy, maintain, and improve software
  • Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
  • Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset
  • Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality
  • Participate in a tight-knit, globally distributed engineering team
  • Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality
  • Manage sole project priorities, deadlines, and deliverables
  • Research, create, and develop software applications to extend and improve on Equifax Solutions
  • Collaborate on scalability issues involving access to data and information
  • Actively participate in Sprint planning, Sprint Retrospectives, and other team activity

What Experience You Need

  • Bachelor's degree or equivalent experience
  • 5+ years of software engineering experience
  • 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
  • 5+ years experience with Cloud technology: GCP, AWS, or Azure
  • 5+ years experience designing and developing cloud-native solutions
  • 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
  • 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What could set you apart

  • Self-starter that identifies/responds to priority shifts with minimal supervision
  • Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
  • UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
  • Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
  • Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle
  • Agile environments (e.g. Scrum, XP)
  • Relational databases (e.g. SQL Server, MySQL)
  • Atlassian tooling (e.g. JIRA, Confluence, and Github)
  • Developing with modern JDK (v1.7+)
  • Automated Testing: JUnit, Selenium, LoadRunner, SoapUI

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks.

Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?

At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence.

We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best.

Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
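GCS also features in this listing's big data item; the following is a minimal, hypothetical sketch of uploading a file to a bucket and listing what landed, using the google-cloud-storage client. The project, bucket, and object paths are placeholders.

```python
# Hypothetical GCS sketch: upload a local report and list objects under a dated prefix.
from google.cloud import storage

client = storage.Client(project="my-project")      # placeholder project
bucket = client.bucket("my-reports-bucket")         # placeholder bucket

# Upload a local file under a dated prefix.
blob = bucket.blob("daily/2024-01-01/report.csv")
blob.upload_from_filename("report.csv")

# List everything under that prefix and print name and size.
for obj in client.list_blobs("my-reports-bucket", prefix="daily/2024-01-01/"):
    print(obj.name, obj.size)
```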

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Greetings from Tata Consultancy Services!

TCS is hiring for GCP Data Engineer with Python.

Experience: 5-10 years
Location: Chennai/Bangalore/Hyderabad/Pune/Gurgaon

Please find the JD below.

Required Technical Skills: GCP data engineering, Dataflow, Dataproc, Cloud Composer, Python, Cloud SQL
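Cloud SQL plus Python is the one combination above not illustrated yet, so here is a minimal, hypothetical sketch that queries a Cloud SQL for MySQL instance through the Cloud SQL Python Connector with the pymysql driver. The instance connection name, credentials, database, and table are placeholders; in a real deployment the password would come from a secret manager rather than code.

```python
# Hypothetical Cloud SQL (MySQL) query from Python via the Cloud SQL Python Connector.
from google.cloud.sql.connector import Connector

connector = Connector()

# Placeholder instance connection name ("project:region:instance") and credentials.
conn = connector.connect(
    "my-project:asia-south1:orders-mysql",
    "pymysql",
    user="app_user",
    password="change-me",
    db="orders",
)

with conn.cursor() as cursor:
    cursor.execute(
        "SELECT id, status FROM orders WHERE created_at >= %s LIMIT 10",
        ("2024-01-01",),
    )
    for row in cursor.fetchall():
        print(row)

conn.close()
connector.close()
```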

Posted 3 weeks ago

Apply

Exploring Composer Jobs in India

India has a growing market for composer jobs, with various opportunities available for talented individuals in the music industry. Whether it's creating music for films, television, video games, or other media, composers play a vital role in shaping the overall experience for audiences. If you're considering a career in composing, here's a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Mumbai
  2. Chennai
  3. Bangalore
  4. Hyderabad
  5. Delhi

These cities are known for their vibrant entertainment industries and often have a high demand for composers across various projects.

Average Salary Range

The average salary range for composer professionals in India can vary depending on experience and expertise. Entry-level composers can expect to earn between INR 3-5 lakhs per year, while experienced composers with a strong portfolio can earn upwards of INR 10 lakhs per year.

Career Path

In the field of composing, a typical career path may involve starting as a Junior Composer, then progressing to a Composer, Senior Composer, and eventually a Music Director or Lead Composer. As you gain more experience and recognition for your work, you may have the opportunity to work on larger projects and collaborate with well-known artists.

Related Skills

In addition to composing skills, it is beneficial for composers to have a good understanding of music theory, proficiency in music production software, excellent communication skills for collaborating with directors and producers, and the ability to work under tight deadlines.

Interview Questions

  • What inspired you to pursue a career in composing? (basic)
  • Can you walk us through your creative process when composing music for a project? (medium)
  • How do you handle feedback and revisions from clients or directors? (medium)
  • Can you discuss a challenging project you worked on and how you overcame obstacles during the composition process? (advanced)
  • How do you stay updated on current trends in the music industry and incorporate them into your work? (medium)
  • Have you ever had to compose music for a project with a tight deadline? How did you manage your time effectively? (medium)
  • Can you provide examples of different genres or styles of music you are comfortable composing? (medium)
  • How do you ensure that your music aligns with the overall vision of a project? (advanced)
  • Have you ever collaborated with other musicians or artists on a composition? How did you approach that collaboration? (medium)
  • What software or tools do you use for composing and producing music? (basic)
  • Can you discuss a piece of music you composed that you are particularly proud of? (medium)
  • How do you handle creative blocks or moments of inspiration? (medium)
  • What is your experience working with live musicians or orchestras for recording sessions? (advanced)
  • How do you approach negotiating fees or contracts for your composition work? (medium)
  • Can you discuss a project where you had to compose music for a specific cultural or historical context? (advanced)
  • How do you ensure that your music is original and does not infringe on copyright laws? (medium)
  • Have you ever had to rework a composition multiple times based on client feedback? How did you handle that situation? (medium)
  • What do you think sets your composing style apart from others in the industry? (medium)
  • How do you approach creating a memorable and impactful musical theme for a project? (medium)
  • Can you discuss a project where you had to compose music for a non-traditional or experimental medium? (advanced)
  • How do you balance artistic integrity with meeting the client's expectations and requirements? (medium)
  • What is your process for creating a soundtrack that enhances the emotional impact of a scene in a film or game? (medium)
  • Can you discuss a time when you had to compose music that evoked a specific mood or atmosphere? (medium)
  • How do you approach collaborating with sound designers or audio engineers to enhance the overall sound of a project? (medium)

Closing Remark

As you prepare for composer roles in India, remember to showcase your unique talents and passion for music in your portfolio and interviews. With dedication and creativity, you can pursue a rewarding career in composing and contribute to the vibrant entertainment industry in India. Good luck with your job search!

