Home
Jobs

463 Composer Jobs - Page 12

Filter Jobs
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0 years

0 Lacs

Jaipur, Rajasthan, India

On-site


Senior Data Analyst: Power BI, GCP, Python & SQL

Job Summary
We are looking for a Senior Data Analyst with strong expertise in Power BI, Google Cloud Platform (GCP), Python, and SQL to design data models, automate analytics workflows, and deliver business intelligence that drives strategic decisions. The ideal candidate is a problem-solver who can work with complex datasets in the cloud, build intuitive dashboards, and code custom analytics using Python and SQL.

Key Responsibilities
- Develop advanced Power BI dashboards and reports based on structured and semi-structured data from BigQuery and other GCP sources.
- Write and optimize complex SQL queries (BigQuery SQL) for reporting and data modeling.
- Use Python to automate data preparation tasks, build reusable analytics scripts, and support ad hoc data requests.
- Partner with data engineers and stakeholders to define metrics, build ETL pipelines, and create scalable data models.
- Design and implement star/snowflake schema models and DAX measures in Power BI.
- Maintain data integrity, monitor performance, and ensure security best practices across all reporting systems.
- Drive initiatives around data quality, governance, and cost optimization on GCP.
- Mentor junior analysts and actively contribute to analytics strategy and roadmap.

Must-Have Skills
- Expert-level SQL: hands-on experience writing complex queries in BigQuery, optimizing joins, window functions, and CTEs.
- Proficiency in Python: data wrangling, Pandas, NumPy, automation scripts, API consumption, etc.
- Power BI expertise: building dashboards, using DAX, Power Query (M), custom visuals, and report performance tuning.
- GCP hands-on experience: especially with BigQuery, Cloud Storage, and optionally Cloud Composer or Dataflow.
- Strong understanding of data modeling, ETL pipelines, and analytics workflows.
- Excellent communication skills and the ability to explain data insights to non-technical audiences.

Preferred Qualifications
- Experience with version control (Git) and working in CI/CD environments.
- Google Professional Data Engineer certification.
- PL-300: Microsoft Power BI Data Analyst Associate certification.
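The must-have skills above pair BigQuery window functions with Pandas data wrangling. As a rough illustration (the table, column names, and figures are invented for the example), the running total that SQL's `SUM(revenue) OVER (PARTITION BY region ORDER BY month)` would compute can be reproduced in Pandas like this:

```python
import pandas as pd

# Hypothetical sales data standing in for a BigQuery extract.
df = pd.DataFrame({
    "region": ["N", "N", "N", "S", "S", "S"],
    "month": [1, 2, 3, 1, 2, 3],
    "revenue": [100.0, 120.0, 90.0, 80.0, 110.0, 130.0],
})

# Equivalent of SQL: SUM(revenue) OVER (PARTITION BY region ORDER BY month)
df = df.sort_values(["region", "month"])
df["running_revenue"] = df.groupby("region")["revenue"].cumsum()

print(df.loc[df["region"] == "N", "running_revenue"].tolist())
# → [100.0, 220.0, 310.0]
```

Being able to move between the SQL and Pandas forms of the same window computation is the kind of fluency this role asks for.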

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job title: R&D Data Modeling Manager Associate
Location: Hyderabad

Sanofi is a global life sciences company committed to improving access to healthcare and supporting the people we serve throughout the continuum of care. From prevention to treatment, Sanofi transforms scientific innovation into healthcare solutions in human vaccines, rare diseases, multiple sclerosis, oncology, immunology, infectious diseases, diabetes and cardiovascular solutions, and consumer healthcare. More than 110,000 people in over 100 countries at Sanofi are dedicated to making a difference in patients' daily lives, wherever they live, and enabling them to enjoy a healthier life. As a company with a global vision of drug development and a highly regarded corporate culture, Sanofi is recognized as one of the best pharmaceutical companies in the world and is pioneering the application of Artificial Intelligence (AI), with a strong commitment to developing advanced data standards that increase reusability and interoperability and thus accelerate impact on global health.

The R&D Data Office serves as a cornerstone of this effort. Our team is responsible for cross-R&D data strategy, governance, and management. We partner with Business and Digital and drive data needs across priority and transformative initiatives across R&D. Team members serve as advisors, leaders, and educators to colleagues and data professionals across the R&D value chain. As an integral team member, you will be responsible for defining how R&D's structured, semi-structured, and unstructured data will be stored, consumed, integrated/shared, and reported by different end users such as scientists and clinicians. You will also be pivotal in developing sustainable mechanisms for ensuring data are FAIR (findable, accessible, interoperable, and reusable).

Position Summary
The primary responsibility of this position is to support semantic integration and data harmonization across pharmaceutical R&D functions. In this role, you will design and implement ontologies and controlled vocabularies that enable interoperability of scientific, clinical, and operational data. Your work will be critical in accelerating discovery, improving data reuse, and enhancing insights across the drug development lifecycle.

Main Responsibilities
- Develop, maintain, and govern ontologies and semantic models for key pharmaceutical domains, including preclinical, clinical, regulatory, and translational research
- Design and implement controlled vocabularies and taxonomies to standardize terminology across experimental data, clinical trials, biomarkers, compounds, and regulatory documentation
- Collaborate with cross-functional teams, including chemists, biologists, pharmacologists, data scientists, and IT architects, to align semantic models with scientific workflows and data standards
- Map internal data sources to public ontologies and standards to ensure FAIR (Findable, Accessible, Interoperable, Reusable) data principles
- Leverage semantic web technologies and ontology tools to build knowledge representation frameworks
- Participate in ontology alignment, reasoning, and validation processes to ensure quality and logical consistency
- Document semantic assets, relationships, and governance policies to support internal education and external compliance

Deliverables
- Domain-specific ontologies representing concepts such as drug discovery (e.g., compounds, targets, assays), preclinical and clinical studies, biomarkers, adverse events, pharmacokinetics/dynamics, mechanisms of action, and disease models, built using OWL/RDF and aligned with public standards
- Controlled vocabularies and taxonomies for experimental conditions, cell lines, compound classes, endpoints, clinical trial protocols, etc.
- Semantic data models supporting the integration of heterogeneous data sources (e.g., lab systems, clinical trial data, external databases)
- Knowledge graphs or knowledge maps for semantic integration of structured data from internal R&D systems
- Mappings to public ontologies, standards, and external knowledge bases such as CDISC, MedDRA, LOINC, UMLS, SNOMED CT, RxNorm, UniProt, DrugBank, PubChem, and NCBI
- Ontology documentation and governance artifacts, including ontology scope, design rationale, versioning documentation, and usage guidelines for internal stakeholders
- Validation reports and consistency checks, including outputs from reasoners or SHACL validation to ensure logical coherence, and change impact assessments when modifying existing ontologies
- Training and stakeholder support materials: slide decks, workshops, and tutorials on using ontologies in data annotation, integration, and search
- Support for application developers embedding semantic layers

About You
Experience:
- 5+ years of experience in ontology engineering, data management, data analysis, data architecture, or another related field
- Proven experience in ontology development within the biomedical or pharmaceutical domain
- Experience working with biomedical ontologies and standards (e.g., GO, BAO, EFO, ChEBI, NCBI Taxonomy, NCI Thesaurus)
- Familiarity with controlled vocabulary curation and knowledge graph construction
- Demonstrated ability to understand end-to-end data use and business needs
- Knowledge and/or experience of Pharma R&D or life sciences data and data domains
- Understanding of FAIR data principles, data governance, and metadata management
- Strong analytical and problem-solving skills; demonstrated attention to detail, quality, time management, and customer focus
- Excellent written and oral communication skills; strong networking, influencing, and negotiating skills
- Demonstrated willingness to make decisions and to take responsibility for them; excellent interpersonal skills (team player)

Technical skills:
- Knowledge and experience in ontology engineering and maintenance are required
- Knowledge and experience with OWL, RDF, SKOS, and SPARQL
- Familiarity with ontology engineering tools (e.g., Protégé, CENtree, TopBraid Composer, PoolParty)
- Familiarity with ontology engineering methodologies (e.g., NeOn, METHONTOLOGY, Uschold and King, Grüninger and Fox)
- Knowledge and experience in data modeling are highly desired
- Experience with pharma R&D platforms, requirements gathering, system design, and validation/quality/compliance requirements
- Experience with hierarchical data models from conceptualization to implementation

Education: Bachelor's in Computer Science, Information Science, Knowledge Engineering, or a related field; Master's or higher preferred
Languages: English
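The validation deliverables above include consistency checks on term hierarchies. As a minimal sketch of one such check (the terms and the child-to-broader-term map are invented, and a real pipeline would run an OWL reasoner or SHACL over the actual ontology), detecting a cyclic broader-term chain in a controlled vocabulary might look like:

```python
# Hypothetical controlled-vocabulary consistency check: a SKOS-style
# child -> broader-term map must not contain cycles, or a term's
# broader-term chain never terminates.
def find_cycles(broader: dict) -> list:
    """Return terms whose broader-term chain never reaches a root."""
    bad = []
    for term in broader:
        seen = set()
        cur = term
        while cur in broader:
            if cur in seen:       # revisited a term: cycle detected
                bad.append(term)
                break
            seen.add(cur)
            cur = broader[cur]
    return bad

vocab = {
    "aspirin": "nsaid",       # aspirin broader-term nsaid
    "nsaid": "analgesic",     # nsaid broader-term analgesic
    "analgesic": "nsaid",     # error: introduces a cycle
}
print(find_cycles(vocab))
# → ['aspirin', 'nsaid', 'analgesic']
```

A reasoner-based validation report would flag the same inconsistency, together with the axioms involved, before the ontology version is published.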

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is seeking creative, high-energy, and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology.

What You'll Do
- Demonstrate a deep understanding of cloud-native, distributed, microservice-based architectures
- Deliver solutions for complex business problems through the standard software SDLC
- Build strong relationships with both internal and external stakeholders, including product, business, and sales partners
- Demonstrate excellent communication skills, with the ability to both simplify complex problems and dive deeper when needed
- Build and manage strong technical teams that deliver complex software solutions at scale
- Manage teams with cross-functional skills that include software, quality, and reliability engineers, project managers, and scrum masters
- Provide deep troubleshooting skills, with the ability to lead and solve production and customer issues under pressure
- Leverage strong experience in full-stack software development and public clouds like GCP and AWS
- Mentor, coach, and develop junior and senior software, quality, and reliability engineers
- Lead with a data/metrics-driven mindset and a relentless focus on optimizing and creating efficient solutions
- Ensure compliance with EFX secure software development guidelines and best practices; responsible for meeting and maintaining QE, DevSec, and FinOps KPIs
- Define, maintain, and report SLAs, SLOs, and SLIs meeting EFX engineering standards, in partnership with the product, engineering, and architecture teams
- Collaborate with architects, SRE leads, and other technical leadership on strategic technical direction, guidelines, and best practices
- Drive up-to-date technical documentation, including support and end-user documentation and runbooks
- Lead Sprint planning, Sprint retrospectives, and other team activities
- Own implementation architecture decision-making associated with product features/stories, refactoring work, and EOSL decisions
- Create and deliver technical presentations to internal and external technical and non-technical stakeholders, communicating with clarity and precision, and present complex information in a concise, audience-appropriate format

What Experience You Need
- Bachelor's degree or equivalent experience
- 7+ years of software engineering experience
- 7+ years of experience writing, debugging, and troubleshooting code in mainstream Java, Spring Boot, TypeScript/JavaScript, HTML, and CSS
- 7+ years of experience with cloud technology: GCP, AWS, or Azure
- 7+ years of experience designing and developing cloud-native solutions
- 7+ years of experience designing and developing microservices using Java, Spring Boot, GCP SDKs, and GKE/Kubernetes
- 7+ years of experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm charts, and Terraform constructs

What could set you apart
- Self-starter who identifies and responds to priority shifts with minimal supervision
- Strong communication and presentation skills
- Strong leadership qualities
- Demonstrated problem-solving skills and the ability to resolve conflicts
- Experience creating and maintaining product and software roadmaps
- Experience overseeing yearly as well as product/project budgets
- Experience working in a highly regulated environment
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, Pub/Sub, GCS, Composer/Airflow, and others
- UI development (e.g., HTML, JavaScript, Angular, and Bootstrap)
- Experience with backend technologies such as Java/J2EE, Spring Boot, SOA, and microservices
- Source code control management systems (e.g., SVN/Git, GitHub) and build tools like Maven and Gradle
- Agile environments (e.g., Scrum, XP)
- Relational databases (e.g., SQL Server, MySQL)
- Atlassian tooling (e.g., JIRA, Confluence, and GitHub)
- Developing with a modern JDK (v1.7+)
- Automated testing: JUnit, Selenium, LoadRunner, SoapUI
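The role asks for defining and reporting SLAs, SLOs, and SLIs. As a hedged sketch of the arithmetic behind SLO reporting (the 99.9% target and request counts are illustrative, not Equifax's actual figures): an availability SLO implies an error budget, and reporting tracks how much of it remains.

```python
# Illustrative error-budget calculation: a 99.9% availability SLO over
# 1,000,000 requests allows 1,000 failed requests for the period.
def error_budget_remaining(total: int, failed: int, slo: float = 0.999) -> float:
    """Fraction of the period's error budget left; negative means the SLO is breached."""
    allowed_failures = total * (1 - slo)
    return 1 - failed / allowed_failures

print(round(error_budget_remaining(1_000_000, 250), 4))
# → 0.75
```

With 250 failures against a 1,000-failure budget, 75% of the budget remains; teams often gate risky releases on this number dropping below a threshold.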

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Title: GN - SONG - Service - Sprinklr - Analyst
Management Level: 11 - Analyst
Location: Bengaluru
Must-have skills: Sprinklr
Good-to-have skills: Ability to leverage design thinking, business process optimization, and stakeholder management skills.

Job Summary:
This role involves driving strategic initiatives, managing business transformations, and leveraging industry expertise to create value-driven solutions.

Roles & Responsibilities:
Provide strategic advisory services, conduct market research, and develop data-driven recommendations to enhance business performance. You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of this, you will:
- Work on high-paced and complex projects.
- Apply an understanding of industry-specific Customer Service processes: possess a strong and well-established record of accomplishment in designing and delivering customer interaction solutions across various interaction channels (IVR, web, email, chat, SMS, social media, etc.).
- Deploy strong design skills: deliver customer interaction solutions across various interaction channels (IVR, web, email, chat, SMS, social media, etc.), as well as quality monitoring, WFM, gamification, and recording. Should be able to envision and design AI-powered customer and employee experience enhancements of the future.

Bring your best skills forward to excel at the role:
- Use technical exposure to contact center and overall customer service areas: plan, design, implement, and configure the chosen Sprinklr platform with our clients.
- Act as a subject matter expert on Sprinklr Service, providing expertise on CCaaS transformation client projects across the entire delivery lifecycle.
- Possess a deep understanding of Sprinklr Service solution architecture, capabilities, and hands-on configurations to activate those capabilities.
- Experience with IVR, outbound voice, email, social media, chat, video, and (a)synchronous messaging services.
- Integration of Sprinklr Unified CXM with enterprise systems.
- Work experience with CI/CD tools.
- Stay current with intensive training and maintain updated Sprinklr certifications.
- Work easily on high-paced and complex projects, using an understanding of industry-specific Customer Service processes, operations, and functional needs.
- Maximize application design and development experience: implement the Sprinklr orchestration platform, preferably in Fortune 500 companies with sophisticated customer interaction operations, leading self-service vendor organizations, or leading consulting firms.
- Good to know: Contact Center as a Service (CCaaS) platforms such as AWS Connect, Genesys Cloud, Nuance CXone, and Microsoft Dynamics 365 Customer Service.

Professional & Technical Skills:
- Relevant experience in the required domain.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced, dynamic environment.
- Functional and hands-on experience solutioning Voice and Non-Voice (SMS, email, chat, and social channels) applications using Architect/Composer/Interaction Designer.

Additional Information:
- Opportunity to work on innovative projects.
- Career growth and leadership exposure.

About Our Company | Accenture
Experience: 3 to 5 years
Educational Qualification: B.Com

Posted 2 weeks ago

Apply

8.0 years

6 - 9 Lacs

Hyderābād

On-site


Hyderabad, India | Operations | In-Office | 8731

Job Description

Job Purpose
ICE Data Services India Private Limited, a subsidiary of Intercontinental Exchange, presents a unique opportunity to work with cutting-edge technology and business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. A successful candidate will be able to multitask in a dynamic team-based environment, demonstrating strong problem-solving and decision-making abilities and the highest degree of professionalism.

The Engineer, Enterprise Endpoint Solutions is part of the team responsible for the global corporate endpoint computing environment. This position is specifically charged with the management and maintenance of the workstation environment for all ICE, Inc. and subsidiary companies. This position requires strong technical proficiency with a range of enterprise tools, as well as an eager attitude, professionalism, and solid communication skills.

Responsibilities

Must have:
- Strong experience with macOS, with familiarity with UNIX shell scripting in sh, Bash, and zsh, or Python.
- Familiarity with Apple platform/macOS Mobile Device Management concepts and enterprise-deployed application configuration methodologies.
- Familiarity with JAMF. Experience with JAMF Composer, or Visual Studio Code and munkipkg.
- Experience with Workspace ONE (AirWatch), including deploying agents, troubleshooting issues, device management, enrolling devices, etc.

Good to have:
- Workstation OS - Familiarity with Windows OS.
- Application Packaging - Experience building applications as MSI and MST packages using Flexera AdminStudio or PSADT (PowerShell App Deployment Toolkit). Exposure to packaging applications using tools such as MS Intune and JAMF is an added advantage for this role.
- Desktop Management Systems - Experience building and managing desktop management infrastructure using tools such as MECM, MS Intune, Core Management, Autopilot, and Azure.
- Image Management - Experience capturing images using MDT or PXE methodology, creating task sequences, and deploying them through SCCM/MECM.
- Patch Management - Software and OS patch management using SCCM, Cloud Management Gateway (CMG), Windows Evergreen, Windows Update for Business, etc.
- Mobile Device Management - Experience managing Android, Windows, and Apple mobile devices using various management tools such as Intune, AirWatch, JAMF, etc.
- Scripting - Strong scripting skills in VBScript, batch, and PowerShell required, with advanced knowledge of PowerShell for management and automation a big plus. Should be comfortable performing tasks from the command line.

Knowledge and Experience
- 8-10+ years of experience in a desktop management role using various macOS technologies such as JAMF, Unix, Python, Bash, and zsh, and Microsoft technologies such as MECM, MS Intune, and Azure.
- College degree in Engineering, MIS, CIS, or a related discipline preferred.
- Familiarity with creating test, migration, and delivery plans for delivering software and security patches globally.
- Familiarity with ServiceNow and Change/Incident Management practices.
- Experience in the Financial Services industry a plus.
- Experience managing VDI environments (VMware Horizon, Citrix) a plus.
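Patch management of the kind described above ultimately reduces to a compliance decision per device. As a rough illustration of that logic (host names, versions, and the required baseline are invented; a real fleet would pull inventory from Intune, JAMF, or SCCM rather than a hard-coded dict):

```python
# Hypothetical patch-compliance gate: compare dotted version strings
# numerically, so that "14.5" >= "14.4" but "13.6.7" < "14.4".
def is_compliant(installed: str, required: str) -> bool:
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return to_tuple(installed) >= to_tuple(required)

# Invented fleet inventory: hostname -> installed OS version.
fleet = {"mac-001": "14.5", "mac-002": "14.2", "mac-003": "13.6.7"}
needs_patch = [h for h, v in fleet.items() if not is_compliant(v, "14.4")]
print(needs_patch)
# → ['mac-002', 'mac-003']
```

Note the tuple comparison: naive string comparison would wrongly rank "10.9" above "10.15", which is exactly the kind of bug scripted patch reporting has to avoid.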

Posted 2 weeks ago

Apply

4.0 - 6.0 years

6 - 9 Lacs

Hyderābād

On-site


Hyderabad, India | Operations | In-Office | 8741

Job Description

Job Purpose
ICE Data Services India Private Limited, a subsidiary of Intercontinental Exchange, presents a unique opportunity to work with cutting-edge technology and business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. A successful candidate will be able to multitask in a dynamic team-based environment, demonstrating strong problem-solving and decision-making abilities and the highest degree of professionalism.

The Engineer, Enterprise Endpoint Solutions is part of the team responsible for the global corporate endpoint computing environment. This position is specifically charged with the management and maintenance of the workstation environment for all ICE, Inc. and subsidiary companies. This position requires strong technical proficiency with a range of enterprise tools, as well as an eager attitude, professionalism, and solid communication skills.

Responsibilities

Must have:
- Workstation OS - Familiarity with Windows OS.
- Application Packaging - Experience building applications as MSI and MST packages using Flexera AdminStudio or PSADT (PowerShell App Deployment Toolkit). Exposure to packaging applications using tools such as MS Intune and JAMF is an added advantage for this role.
- Desktop Management Systems - Experience building and managing desktop management infrastructure using tools such as MECM, MS Intune, Tanium Core Management, Autopilot, and Azure.
- Image Management - Experience capturing images using MDT or PXE methodology, creating task sequences, and deploying them through SCCM/MECM.
- Patch Management - Software and OS patch management using SCCM, Cloud Management Gateway (CMG), Windows Evergreen, Windows Update for Business, etc.
- Mobile Device Management - Experience managing Android, Windows, and Apple mobile devices using various management tools such as Intune, AirWatch, JAMF, etc.
- Scripting - Strong scripting skills in VBScript, batch, and PowerShell required, with advanced knowledge of PowerShell for management and automation a big plus. Should be comfortable performing tasks from the command line.

Good to have:
- Workstation OS - Strong experience with macOS, with familiarity with UNIX shell scripting in sh, Bash, and zsh, or Python.
- macOS Tech - Familiarity with JAMF a plus. Experience with JAMF Composer, or Visual Studio Code and munkipkg.

Knowledge and Experience
- 4-6 years of experience in a desktop management role using various macOS technologies such as JAMF, Unix, Python, Bash, and zsh, and Microsoft technologies such as MECM, MS Intune, and Azure.
- College degree in Engineering, MIS, CIS, or a related discipline preferred.
- Familiarity with creating test, migration, and delivery plans for delivering software and security patches globally.
- Familiarity with ServiceNow and Change/Incident Management practices.
- Experience in the Financial Services industry a plus.
- Experience managing VDI environments (VMware Horizon, Citrix) a plus.

Posted 2 weeks ago

Apply

0 years

6 - 9 Lacs

Hyderābād

Remote


Hyderabad, India | Chennai, India
Job ID: R-1069077
Apply prior to the end date: August 8th, 2025

When you join Verizon
You want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work, and play by connecting them to what brings them joy. We do what we love: driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the #VTeamLife.

What you'll be doing…
We are looking for data engineers who can work with world-class team members to help drive the telecom business to its full potential. We are building data products and assets for the telecom wireless and wireline business, including consumer analytics, telecom network performance, and service assurance analytics. We are working on cutting-edge technologies like digital twins to build these analytical platforms and provide data support for varied AI/ML implementations. As a data engineer, you will collaborate with business product owners, coaches, industry-renowned data scientists, and system architects to develop strategic data solutions from sources that include batch, file, and data streams.

As a Data Engineer with ETL/ELT expertise for our growing data platform and analytics teams, you will understand and enable the required data sets from different sources, both structured and unstructured, into our data warehouse and data lake, with real-time streaming and/or batch processing, to generate insights and perform analytics for business teams within Verizon.

- Understanding the business requirements and the technical design.
- Working on data ingestion, preparation, and transformation.
- Developing data streaming applications.
- Debugging production failures and identifying solutions.
- Working on ETL/ELT development.

What we're looking for...
You're curious about new technologies and the game-changing possibilities they create. You like to stay up to date with the latest trends and apply your technical expertise to solving business problems.

You'll need to have:
- Bachelor's degree or one or more years of work experience.
- Experience with Data Warehouse concepts and the Data Management life cycle.
- Experience in any DBMS.
- Experience in shell scripting, Spark, and Scala.
- Experience in GCP, Cloud Composer, and BigQuery.

Even better if you have one or more of the following:
- Two or more years of relevant experience.
- Any relevant ETL/ELT developer certification.
- GCP Data Engineer certification.
- Accuracy and attention to detail.
- Good problem-solving, analytical, and research capabilities.
- Good verbal and written communication.
- Experience presenting to and influencing stakeholders.

Where you'll be working
In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity
Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability, or any other legally protected characteristics.
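The posting distinguishes batch processing from data streaming applications. As a minimal sketch of the streaming pattern (event fields and cell IDs are invented; in production this fold would run inside a framework such as Spark Structured Streaming rather than plain Python), the key idea is incrementally folding each event into a running aggregate instead of re-reading a whole batch:

```python
from collections import defaultdict

def consume(stream):
    """Maintain a running bytes-per-cell aggregate over a stream of usage events."""
    totals = defaultdict(int)
    for event in stream:               # each event is processed once, as it arrives
        totals[event["cell_id"]] += event["bytes"]
    return dict(totals)

# Invented telemetry events standing in for a real ingestion stream.
events = [
    {"cell_id": "HYD-01", "bytes": 512},
    {"cell_id": "CHN-07", "bytes": 128},
    {"cell_id": "HYD-01", "bytes": 256},
]
print(consume(iter(events)))
# → {'HYD-01': 768, 'CHN-07': 128}
```

The batch equivalent would group the full day's data in BigQuery; the streaming form trades that completeness for low-latency, per-event updates.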

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

India

On-site


Job Description

Are You Ready to Make It Happen at Mondelēz International?
Join our Mission to Lead the Future of Snacking. Make It With Pride.

Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results.

How You Will Contribute
You will:
- Work in close partnership with the business leadership team to execute the analytics agenda
- Identify and incubate best-in-class external partners to drive delivery on strategic projects
- Develop custom models/algorithms to uncover signals, patterns, and trends that drive long-term business performance
- Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver

What You Will Bring
A desire to drive your future and accelerate your career, along with the following experience and knowledge:
- Using data analysis to make recommendations to senior leaders
- Technical experience in best-in-class analytics practices
- Experience deploying new analytical approaches in a complex and highly matrixed organization
- Savvy in using analytics techniques to create business impact

In this role, you will be a key technical leader in developing our cutting-edge Supply Chain Data Product ecosystem. You'll have the opportunity to design, build, and automate data ingestion, harmonization, and transformation processes, driving advanced analytics, reporting, and insights to optimize Supply Chain performance across the organization. You will play an instrumental part in engineering robust and scalable data solutions, acting as a hands-on expert for Supply Chain data, and contributing to how these data products are visualized and interacted with.

What you need to know about this position:
Our team is at the forefront of building Data Products driving AI and Analytics across Mondelez. As a Sr. Data & Analytics Engineer within the Mondelez Data Product team, you will work towards the following objectives:
- Build and Enhance Supply Chain Data Products: Design, develop, and maintain reusable Data Products as the single source of truth for Supply Chain, logistics, manufacturing, and related data domains.
- End-to-End Data Pipeline Development: Leverage data from diverse internal systems (especially SAP ECC/S4HANA, SAP BW/HANA, and other Supply Chain platforms) and external data sources, ingesting it into our centralized Google Cloud Platform (GCP) data platform.
- Enable Data Governance and Management: Implement and champion Data Governance and Data Management standards, including data cataloging, documentation, security protocols, robust data quality controls, master data management (MDM) principles, and data democratization.
- Scalable Data Modeling & Implementation: Design, implement, and optimize reusable and scalable data models following industry best practices and high coding standards, ensuring efficient data flow for analytical consumption.
- Hands-on Technical Leadership: Play a lead technical role throughout the entire Software Development Lifecycle (SDLC), from requirements gathering and design to development, testing, deployment, and hypercare.
- Product and Value-Driven Mindset: Build each feature with a Product Owner mindset, focusing on delivering business value with efficiency, agility, and a strong bias for action. This includes considering the end-user experience of the data products.
- Technical Point of Contact: Act as a key technical expert and point of contact within the Data Engineering team for Supply Chain data initiatives.

What extra ingredients you will bring:
- Proven hands-on experience designing and developing complex data models and high-quality, performant data pipelines.
- Passion for leveraging data to drive tangible improvements in Supply Chain efficiency, cost reduction, and operational excellence.
- Ability to thrive in an entrepreneurial, fast-paced setting, managing complex data challenges with a solutions-oriented approach.
- Excellent communication and collaboration skills to facilitate effective teamwork, engage with Supply Chain stakeholders, and explain complex data concepts to both technical and non-technical individuals.
- Strong problem-solving skills and business acumen, particularly within the Supply Chain domain.

Education / Certifications:
- Bachelor's degree in Information Systems/Technology, Computer Science, Analytics, Engineering, or a related field.
- 10+ years of hands-on experience in data engineering, data warehousing, or a similar technical role, preferably in CPG or manufacturing with a strong focus on Supply Chain data.

Job-specific requirements (hands-on experience focus):
- SAP Data Expertise: Deep hands-on experience extracting, transforming, and modeling data from SAP ECC/S4HANA (modules like MM, SD, PP, QM, FI/CO) and SAP BW/HANA. Proven ability to understand SAP data structures and business processes within Supply Chain.
- Data Pipeline Development: Design, build, and maintain robust and efficient ETL/ELT processes for data integration, ensuring data accuracy, integrity, and timeliness.
- Cloud Data Engineering (GCP-focused): Strong proficiency and hands-on experience with data warehousing solutions and data engineering services within the Google Cloud Platform (GCP) ecosystem (e.g., BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub). Hands-on experience with Databricks (ideally deployed on GCP or with GCP integration) for large-scale data processing, Spark-based transformations, and advanced analytics is highly desirable.
- Data Modeling & Warehousing: Hands-on experience developing efficient data models (e.g., dimensional, Data Vault) and building/maintaining data warehouses, primarily on GCP.
- Programming & Automation: Strong proficiency in SQL and Python for data manipulation, pipeline development, and automation.
- API Integration: Experience working with multiple APIs to extract and/or send data for system integration.
- BI & Analytics Enablement: Collaborate with data scientists, analysts, and business users to provide high-quality, reliable data for their analyses and models. Support the development of data consumption layers, including dashboards (e.g., Tableau, Power BI).
- UI/Front-End Experience (desirable): Experience with front-end technologies and JavaScript frameworks (e.g., React, Angular, Vue.js) for building custom data visualizations, interactive dashboards, or user interfaces for Data Products is a significant plus.
- System Monitoring & Optimization: Monitor data processing systems and pipelines to ensure efficiency, reliability, performance, and uptime; proactively identify and resolve bottlenecks.
- DevOps & CI/CD: Prior hands-on work in a DevOps environment, or a strong understanding of and experience implementing CI/CD pipelines for data solutions.
- Data Governance & Security: Ensure compliance with data privacy regulations and implement data security standards and best practices within data solutions on GCP.
- Industry Knowledge: Solid understanding of the consumer goods industry, particularly Supply Chain processes and relevant key performance indicators (KPIs).
- Agile Methodologies: Experience working in Agile development environments.
- Continuous Learning: Stay updated on the latest technologies and best practices in data engineering, GCP data services, Databricks, UI development trends, and database management.

Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary
At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong global and local brands, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate, and candy, and the second position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular
Analytics & Modelling | Analytics & Data Science

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Job Description Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride. Together with analytics team leaders, you will support our business with excellent data models to uncover trends that can drive long-term business results. How You Will Contribute You will: Work in close partnership with the business leadership team to execute the analytics agenda Identify and incubate best-in-class external partners to drive delivery on strategic projects Develop custom models/algorithms to uncover signals/patterns and trends to drive long-term business performance Execute the business analytics program agenda using a methodical approach that conveys to stakeholders what business analytics will deliver What You Will Bring A desire to drive your future and accelerate your career and the following experience and knowledge: Using data analysis to make recommendations to senior leaders Technical experience in roles in best-in-class analytics practices Experience deploying new analytical approaches in a complex and highly matrixed organization Savvy in the use of analytics techniques to create business impact In this role, you will be a key technical leader in developing our cutting-edge Supply Chain Data Product ecosystem. You'll have the opportunity to design, build, and automate data ingestion, harmonization, and transformation processes, driving advanced analytics, reporting, and insights to optimize Supply Chain performance across the organization. You will play an instrumental part in engineering robust and scalable data solutions, acting as a hands-on expert for Supply Chain data, and contributing to how these data products are visualized and interacted with. What you need to know about this position: Our team is at the forefront of building Data Products driving AI and Analytics across Mondelez. As a Sr. 
Data & Analytics Engineer within the Mondelez Data Product team, you will be working towards the following objectives: Build and Enhance Supply Chain Data Products: Design, develop, and maintain reusable Data Products as the single source of truth for Supply Chain, logistics, manufacturing, and related data domains. End-to-End Data Pipeline Development: Leverage data from diverse internal systems (especially SAP ECC/S4HANA, SAP BW/HANA, and other Supply Chain platforms) and external data sources, ingesting it into our centralized Google Cloud Platform (GCP) data platform. Enable Data Governance and Management: Implement and champion Data Governance and Data Management standards, including data cataloging, documentation, security protocols, robust data quality controls, master data management (MDM) principles, and data democratization. Scalable Data Modeling & Implementation: Design, implement, and optimize reusable and scalable data models following industry best practices and high coding standards, ensuring efficient data flow for analytical consumption. Hands-on Technical Leadership: Play a lead technical role throughout the entire Software Development Lifecycle (SDLC) – from requirements gathering and design to development, testing, deployment, and hypercare. Product and Value-Driven Mindset: Build each feature with a Product Owner mindset, focusing on delivering business value with efficiency, agility, and a strong bias for action. This includes considering the end-user experience of the data products. Technical Point of Contact: Act as a key technical expert and point of contact within the Data Engineering team for Supply Chain data initiatives. What extra ingredients you will bring: Proven hands-on experience designing and developing complex data models and high-quality, performant data pipelines. Passion for leveraging data to drive tangible improvements in Supply Chain efficiency, cost reduction, and operational excellence. 
Ability to thrive in an entrepreneurial, fast-paced setting, managing complex data challenges with a solutions-oriented approach. Excellent communication and collaboration skills to facilitate effective teamwork, engage with Supply Chain stakeholders, and explain complex data concepts to both technical and non-technical individuals. Strong problem-solving skills and business acumen, particularly within the Supply Chain domain. Education / Certifications: Bachelor's degree in Information Systems/Technology, Computer Science, Analytics, Engineering, or a related field. 10+ years of hands-on experience in data engineering, data warehousing, or a similar technical role, preferably in CPG or manufacturing with a strong focus on Supply Chain data. Job specific requirements (Hands-on Experience Focus): SAP Data Expertise: Deep hands-on experience in extracting, transforming, and modeling data from SAP ECC/S4HANA (modules like MM, SD, PP, QM, FI/CO) and SAP BW/HANA. Proven ability to understand SAP data structures and business processes within Supply Chain. Data Pipeline Development: Design, build, and maintain robust and efficient ETL/ELT processes for data integration, ensuring data accuracy, integrity, and timeliness. Cloud Data Engineering (GCP Focused): Strong proficiency and hands-on experience in data warehousing solutions and data engineering services within the Google Cloud Platform (GCP) ecosystem (e.g., BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub). Hands-on experience with Databricks (ideally deployed on GCP or with GCP integration) for large-scale data processing, Spark-based transformations, and advanced analytics is highly desirable. Data Modeling & Warehousing: Hands-on experience in developing efficient data models (e.g., dimensional, Data Vault) and building/maintaining data warehouses, primarily on GCP. Programming & Automation: Strong proficiency in SQL and Python for data manipulation, pipeline development, and automation. 
API Integration: Experience working with multiple APIs to extract and/or send data for system integration. BI & Analytics Enablement: Collaborate with data scientists, analysts, and business users to provide high-quality, reliable data for their analyses and models. Support the development of data consumption layers, including dashboards (e.g., Tableau, Power BI). UI/Front-End Experience (Desirable): Experience with front-end technologies and JavaScript frameworks (e.g., React JS, Angular, Vue.js) for building custom data visualizations, interactive dashboards, or user interfaces for Data Products is a significant plus. System Monitoring & Optimization: Monitor data processing systems and pipelines to ensure efficiency, reliability, performance, and uptime; proactively identify and resolve bottlenecks. DevOps & CI/CD: Prior hands-on work in a DevOps environment or strong understanding and experience implementing CI/CD pipelines for data solutions. Data Governance & Security: Ensure compliance with data privacy regulations and implement data security standards and best practices within data solutions on GCP. Industry Knowledge: Solid understanding of the consumer goods industry, particularly Supply Chain processes and relevant key performance indicators (KPIs). Agile Methodologies: Experience working in Agile development environments. Continuous Learning: Stay updated on the latest technologies and best practices in data engineering, GCP data services, Databricks, UI development trends, and database management. Within-country relocation support is available, and for candidates voluntarily moving internationally, some minimal support is offered through our Volunteer International Transfer Policy. Business Unit Summary At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. 
That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands globally and locally including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate and candy and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen—and happen fast. Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law. Job Type Regular Analytics & Modelling Analytics & Data Science Show more Show less
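The "robust data quality controls" this listing calls for can be as simple as named predicates applied per row before data is published for analytical consumption. A minimal pure-Python sketch; the column names, rules, and report shape are illustrative assumptions, not taken from any real Mondelez pipeline:

```python
from dataclasses import dataclass, field
from typing import Callable

# Hypothetical rule-based quality gate; column names and rules are
# illustrative, not taken from any real pipeline.

@dataclass
class QualityReport:
    total: int
    failures: dict = field(default_factory=dict)  # rule name -> failing row count

    @property
    def passed(self) -> bool:
        return not self.failures

def run_checks(rows: list[dict], checks: dict[str, Callable[[dict], bool]]) -> QualityReport:
    """Apply each named predicate to every row and count the failures."""
    report = QualityReport(total=len(rows))
    for name, predicate in checks.items():
        bad = sum(1 for row in rows if not predicate(row))
        if bad:
            report.failures[name] = bad
    return report

shipments = [
    {"order_id": "A1", "qty": 10, "plant": "P01"},
    {"order_id": "A2", "qty": -3, "plant": "P01"},  # negative quantity
    {"order_id": None, "qty": 5, "plant": "P02"},   # missing business key
]

report = run_checks(shipments, {
    "order_id_not_null": lambda r: r["order_id"] is not None,
    "qty_positive": lambda r: r["qty"] > 0,
})
print(report.passed)    # False
print(report.failures)  # {'order_id_not_null': 1, 'qty_positive': 1}
```

In a production pipeline the same idea would typically run as a task after the load step, failing the run (or quarantining rows) when the report does not pass.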

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

Linkedin logo

Role Description Key Responsibilities Design, develop, and optimize ETL pipelines using PySpark on Google Cloud Platform (GCP). Work with BigQuery, Cloud Dataflow, Cloud Composer (Apache Airflow), and Cloud Storage for data transformation and orchestration. Develop and optimize Spark-based ETL processes for large-scale data processing. Implement best practices for data governance, security, and monitoring in a cloud environment. Collaborate with data engineers, analysts, and business stakeholders to understand data requirements. Troubleshoot performance bottlenecks and optimize Spark jobs for efficient execution. Automate data workflows using Apache Airflow or Cloud Composer. Ensure data quality, validation, and consistency across pipelines. 5+ years of experience in ETL development with a focus on PySpark. Strong hands-on experience with Google Cloud Platform (GCP) services, including: BigQuery Cloud Dataflow / Apache Beam Cloud Composer (Apache Airflow) Cloud Storage Proficiency in Python and PySpark for big data processing. Experience with data lake architectures and data warehousing concepts. Knowledge of SQL for data querying and transformation. Experience with CI/CD pipelines for data pipeline automation. Strong debugging and problem-solving skills. Experience with Kafka or Pub/Sub for real-time data processing. Knowledge of Terraform for infrastructure automation on GCP. Experience with containerization (Docker, Kubernetes). Familiarity with DevOps and monitoring tools like Prometheus, Stackdriver, or Datadog. Skills: GCP, PySpark, ETL Show more Show less
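Cloud Composer (Apache Airflow), mentioned throughout this listing, runs pipeline tasks in dependency order. A hedged sketch of that ordering using only the standard library; the task names are hypothetical, and a real DAG would be declared with Airflow operators rather than `graphlib`:

```python
from graphlib import TopologicalSorter

# Minimal sketch of the dependency ordering that Cloud Composer (Airflow)
# applies to pipeline tasks. Task names are hypothetical; a real DAG is
# declared with Airflow operators, not the stdlib graphlib module.

dag = {
    "extract_gcs":     set(),                # pull raw files from Cloud Storage
    "transform_spark": {"extract_gcs"},      # PySpark cleansing and joins
    "load_bigquery":   {"transform_spark"},  # write curated tables to BigQuery
    "data_quality":    {"load_bigquery"},    # validate row counts and nulls
}

# static_order() yields each task only after all of its upstream tasks.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract_gcs', 'transform_spark', 'load_bigquery', 'data_quality']
```

The same chain in Airflow would be written with `>>` between operators (`extract >> transform >> load >> quality`); the scheduler performs the equivalent topological ordering for you.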

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Hyderabad, Telangana, India; Bengaluru, Karnataka, India. Minimum qualifications: Bachelor's degree in Education, Instructional Design, a related field or equivalent practical experience. 7 years of experience working in domestic and international environments managing vendors (e.g., suppliers, manufacturers) or third-party logistics. 7 years of experience in a customer or client-facing role supporting logistics operations. 5 years of experience in managing operations. Preferred qualifications: Experience using learning management systems to organize and deploy training at scale. Experience with scaled operations and defining learning and development strategies in a vendor first operating model. Ability to build partnerships with business partners and team members; ability to influence others towards desired outcomes. Familiarity with Connect Composer, Google's content management system, or other standard CMS software. Excellent communication skills to interact with executive leadership, with well-focused attention to detail, to craft tactical narratives and to facilitate sessions. About The Job A problem isn’t truly solved until it’s solved for all. That’s why Googlers build products that help create opportunities for everyone, whether down the street or across the globe. As a Program Manager at Google, you’ll lead complex, multi-disciplinary projects from start to finish — working with stakeholders to plan requirements, manage project schedules, identify risks, and communicate clearly with cross-functional partners across the company. Your projects will often span offices, time zones, and hemispheres. It's your job to coordinate the players and keep them up to date on progress and deadlines. As Training Program Manager, you will help agents to delight customers by providing learner-centric training strategies. 
You will analyze trends and future launches to anticipate training needs and advocate the continual improvement of the agent training experience. Working with our cross-functional teams, you will create training project plans, review the design and development process, develop knowledge base articles, and coordinate with our vendors to ensure smooth delivery to our global service delivery centers. You will report training progress and evaluate the training to understand its effectiveness in preparing agents to provide quality service to our consumers. Additionally, you should have excellent leadership, organizational, problem-solving, networking, and communications skills. At YouTube, we believe that everyone deserves to have a voice, and that the world is a better place when we listen, share, and build community through our stories. We work together to give everyone the power to share their story, explore what they love, and connect with one another in the process. Working at the intersection of cutting-edge technology and boundless creativity, we move at the speed of culture with a shared goal to show people the world. We explore new ideas, solve real problems, and have fun — and we do it all together. Responsibilities Consult with business partners and stakeholders to determine the most effective training strategy to support products and workflows. Provide thought leadership to build the training program which balances efficiency and effectiveness goals for the organization. Analyze trends in quality results, product launches, anticipate learning needs, and devise appropriate training interventions. Lead the development and implementation of a curricular framework for all YouTube training that drives business results. Drive training development to completion by managing timelines, overseeing the instructional design process, and coordinating the delivery of training to our help centers. 
Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form . Show more Show less

Posted 2 weeks ago

Apply


2.0 years

0 Lacs

Surat, Gujarat, India

On-site

Linkedin logo

We are seeking a highly skilled PHP Developer with deep experience in MySQL and the CodeIgniter framework. The ideal candidate should have a strong understanding of database design and advanced query optimization techniques to ensure application performance, especially in high-load environments. Key Responsibilities - Design, develop, and maintain scalable web applications using PHP (CodeIgniter) and MySQL. - Analyze slow-performing queries and implement in-depth optimization techniques such as indexing, query restructuring, and caching mechanisms. - Write clean, well-documented, and efficient code following best practices. - Perform regular performance tuning of MySQL databases including stored procedures, triggers, and views. - Collaborate with frontend developers and product teams to deliver robust and scalable backend solutions. - Conduct thorough testing and debugging to ensure application reliability. - Implement RESTful APIs and third-party integrations where needed. - Maintain version control and collaborative workflows using Git. Required Skills & Qualifications - Strong command of PHP and MySQL, with a minimum of 2 years in the CodeIgniter framework. - Deep understanding of SQL query optimization: EXPLAIN and profiling, indexing strategies (composite, covering, partial), query plan analysis and refactoring. - Experience in designing normalized and denormalized schemas for high-performance use cases. - Familiarity with caching techniques (Memcached, Redis) is a plus. - Knowledge of MVC architecture and clean code principles. - Proficient in using tools like MySQL Workbench, phpMyAdmin, or similar for query analysis. - Comfortable with Linux environments and command-line MySQL operations. - Good communication skills and ability to work in a team. - Hands-on experience with Git for version control, branching, and merging workflows. Preferred Qualifications - Experience working on high-traffic web applications. - Understanding of CI/CD pipelines. 
- Exposure to modern PHP development tools and practices (e.g., Composer, PSR standards). - Knowledge of front-end basics (HTML, CSS, JavaScript) is a plus. Why Join Us? - Opportunity to work on scalable, performance-critical applications. - Collaborative team environment focused on innovation and best practices. - Flexible work environment and career growth potential. Show more Show less
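The EXPLAIN-and-index workflow in the requirements above can be sketched end to end. This uses SQLite via Python so the example is self-contained and runnable; with MySQL you would run `EXPLAIN` and inspect its `type`/`key` columns instead, and the `orders` table here is hypothetical:

```python
import sqlite3

# Sketch of the EXPLAIN-driven indexing workflow from the requirements,
# using SQLite so it runs self-contained; MySQL's EXPLAIN output differs,
# and the orders table here is hypothetical.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, status TEXT)")
con.executemany(
    "INSERT INTO orders (customer_id, status) VALUES (?, ?)",
    [(i % 100, "open" if i % 3 else "closed") for i in range(1000)],
)

query = "SELECT id FROM orders WHERE customer_id = ? AND status = ?"

# No index on the filter columns yet: the planner falls back to a full scan.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query, (42, "open")).fetchone()[3]
print(plan_before)  # e.g. 'SCAN orders'

# A composite index matching the WHERE clause turns the scan into a seek.
con.execute("CREATE INDEX idx_orders_cust_status ON orders (customer_id, status)")
plan_after = con.execute("EXPLAIN QUERY PLAN " + query, (42, "open")).fetchone()[3]
print(plan_after)  # e.g. 'SEARCH orders USING INDEX idx_orders_cust_status ...'
```

The same before/after comparison in MySQL (full scan versus index lookup) is what the "query plan analysis and refactoring" bullet is describing.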

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Role : GCP Cloud Architect. Location : Hyderabad. Notice period : Immediate joiners needed. Shift timings : US Time zones. Work Mode : Work from Office. Job Description Opportunity : We are seeking a highly skilled and experienced GCP Cloud Architect to join our dynamic technology team. You will play a crucial role in designing, implementing, and managing our Google Cloud Platform (GCP) infrastructure, with a primary focus on building a robust and scalable Data Lake in BigQuery. You will be instrumental in ensuring the reliability, security, and performance of our cloud environment, supporting critical healthcare data initiatives. This role requires strong technical expertise in GCP, excellent problem-solving abilities, and a passion for leveraging cloud technologies to drive impactful solutions within the healthcare domain. Cloud Architecture & Design : Design and architect scalable, secure, and cost-effective GCP solutions, with a strong emphasis on BigQuery for our Data Lake. Define and implement best practices for GCP infrastructure management, security, networking, and data governance. Develop and maintain comprehensive architectural diagrams, documentation, and standards. Collaborate with data engineers, data scientists, and application development teams to understand their requirements and translate them into robust cloud solutions. Evaluate and recommend new GCP services and technologies to optimize our cloud environment. Understand and implement the fundamentals of GCP, including resource hierarchy, projects, organizations, and billing. GCP Infrastructure Management : Manage and maintain our existing GCP infrastructure, ensuring high availability, performance, and security. Implement and manage infrastructure-as-code (IaC) using tools like Terraform or Cloud Deployment Manager. Monitor and troubleshoot infrastructure issues, proactively identifying and resolving potential problems. Implement and manage backup and disaster recovery strategies for our GCP environment. 
Optimize cloud costs and resource utilization, including BigQuery slot usage. Collaboration & Communication : Work closely with cross-functional teams, including data engineering, data science, application development, security, and compliance. Communicate technical concepts and solutions effectively to both technical and non-technical stakeholders. Provide guidance and mentorship to junior team members. Participate in on-call rotation as needed. Develop and maintain thorough and reliable documentation of all cloud infrastructure processes, configurations, and security measures. Qualifications : Bachelor's degree in Computer Science, Engineering, or a related field. Minimum of 5-8 years of experience in designing, implementing, and managing cloud infrastructure, with a strong focus on Google Cloud Platform (GCP). Proven experience in architecting and implementing Data Lakes on GCP, specifically using BigQuery. Hands-on experience with ETL/ELT processes and tools, with strong proficiency in Google Cloud Composer (Apache Airflow). Solid understanding of GCP services such as Compute Engine, Cloud Storage, Networking (VPC, Firewall Rules, Cloud DNS), IAM, Cloud Monitoring, and Cloud Logging. Experience with infrastructure-as-code (IaC) tools like Terraform or Cloud Deployment Manager. Strong understanding of security best practices for cloud environments, including identity and access management, data encryption, and network security. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication, collaboration, and interpersonal skills. Bonus Points Experience with Apigee for API management. Experience with containerization technologies like Docker and orchestration platforms like Cloud Run. Experience with Vertex AI for machine learning workflows on GCP. Familiarity with GCP Healthcare products and solutions (e.g., Cloud Healthcare API). Knowledge of healthcare data standards and regulations (e.g., HIPAA, HL7, FHIR). GCP Professional Architect certification. 
Experience with scripting languages (e.g., Python, Bash). Experience with Looker. (ref:hirist.tech) Show more Show less
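Cost optimization for a BigQuery data lake usually starts with bytes scanned, since on-demand queries are billed by scan volume. A back-of-envelope sketch; the price per TiB below is an assumption to verify against current GCP pricing, and the scan sizes are made up:

```python
# Back-of-envelope cost check for BigQuery on-demand queries, the kind of
# cost optimization this role covers. The price per TiB is an assumption
# (verify against current GCP pricing); the scan sizes are made up.

PRICE_PER_TIB_USD = 6.25  # assumed on-demand rate, not authoritative

def query_cost_usd(bytes_scanned: int) -> float:
    """On-demand cost estimate: bytes scanned converted to TiB times the rate."""
    return round(bytes_scanned / 2**40 * PRICE_PER_TIB_USD, 4)

full_scan = 500 * 2**30          # 500 GiB: querying an unpartitioned table
pruned_scan = 500 * 2**30 // 30  # ~1 day's partition of a 30-day table

print(query_cost_usd(full_scan))    # 3.0518
print(query_cost_usd(pruned_scan))  # 0.1017
```

The roughly 30x gap between the two estimates is why partitioning and clustering the Data Lake tables is usually the first cost lever an architect reaches for.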

Posted 2 weeks ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

_VOIS Intro About _VOIS: _VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, _VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. _VOIS Centre Intro About _VOIS India: In 2009, _VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, _VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Role Related Content (Role specific) Key Responsibilities Design, develop, and maintain scalable data pipelines and ETL processes using GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer, Cloud Functions, Cloud Run Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions. Implement data integration solutions to ingest, process, and store large volumes of structured and unstructured data from various sources. 
Optimize and tune data pipelines for performance, reliability, and cost-efficiency. Ensure data quality and integrity through data validation, cleansing, and transformation processes. Develop and maintain data models, schemas, and metadata to support data analytics and reporting. Monitor and troubleshoot data pipeline issues, ensuring timely resolution and minimal disruption to data workflows. Stay up-to-date with the latest GCP technologies and best practices, and provide recommendations for continuous improvement. Mentor and guide junior data engineers, fostering a culture of knowledge sharing and collaboration. Proficiency in GCP services such as BigQuery, Cloud Data Fusion, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer, Cloud Functions, Cloud Run. Strong programming skills in Python and PL/SQL. Experience with SQL and NoSQL databases. Knowledge of data warehousing concepts and best practices. Familiarity with data integration tools and frameworks. _VOIS Equal Opportunity Employer Commitment India _VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. 
We have also been highlighted among the Top 5 Best Workplaces for Diversity, Equity, and Inclusion, Top 10 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 14th Overall Best Workplaces in India by the Great Place to Work Institute in 2023. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 2 weeks ago

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview Job Title- Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP Location- Pune, India Role Description The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness is expected of the important engineering principles of the bank. Root cause analysis skills develop through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews, and automating the delivery life cycle. The successful candidate should be able to work independently on medium to large sized projects with strict deadlines, should be able to work in a cross-application mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as a part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regulatory portfolio, addressing various regulatory commitments and mandates. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term Life Insurance Complimentary health screening for 35 yrs. 
and above. Your Key Responsibilities Analyzing data sets and designing and coding stable and scalable data ingestion workflows, also integrating them into existing workflows. Working with team members and stakeholders to clarify requirements and provide the appropriate ETL solution. Work as a senior developer for developing analytics algorithms on top of ingested data. Work as a senior developer for various data sourcing in Hadoop and GCP. Ensuring new code is tested both at unit level and system level; design, develop, and peer review new code and functionality. Operate as a team member of an agile scrum team. Root cause analysis skills to identify bugs and issues for failures. Support Prod support and release management teams in their tasks. Your skills and experience: More than 6 years of coding experience in reputed organizations Hands-on experience in Bitbucket and CI/CD pipelines Proficient in Hadoop, Python, Spark, SQL, Unix and Hive Basic understanding of on-prem and GCP data security Hands-on development experience on large ETL/big data systems, GCP being a big plus Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc. Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc. Basic understanding of data quality dimensions like Consistency, Completeness, Accuracy, Lineage, etc. Hands-on business and systems knowledge gained in a regulatory delivery environment. Banking experience with regulatory and cross-product knowledge. Passionate about test-driven development. Prior experience with release management tasks and responsibilities. Data visualization experience in Tableau is good to have. How We’ll Support You Training and development to help you excel in your career. Coaching and support from experts in your team A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. 
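The data quality dimensions this posting lists (Consistency, Completeness, Accuracy, Lineage) are typically tracked as simple metrics computed over each ingested batch. A hedged sketch of a completeness check in plain Python, purely illustrative and not the bank's actual tooling:

```python
def completeness(rows, columns):
    """Fraction of non-null values per column; 1.0 means fully populated."""
    total = len(rows)
    if total == 0:
        return {col: 1.0 for col in columns}
    return {
        col: sum(1 for r in rows if r.get(col) is not None) / total
        for col in columns
    }

# Hypothetical batch of ingested account records
rows = [
    {"account": "A1", "owner": "alice"},
    {"account": "A2", "owner": None},
    {"account": "A3"},  # 'owner' missing entirely
]
scores = completeness(rows, ["account", "owner"])
# 'account' is fully populated; 'owner' is only one-third populated
```

A pipeline would compare such scores against thresholds and fail or alert when a batch falls below them, which is what turns a quality "dimension" into an enforceable check.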
About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 weeks ago

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity!! Job Description: Exp: 6-12 yrs Location: Hyderabad/Bangalore/Pune/Delhi Skill: GCP Data Engineer - Proficiency in programming languages: Python - Expertise in data processing frameworks: Apache Beam (Dataflow), Kafka - Hands-on experience with GCP services: BigQuery, Dataflow, Composer, Spanner - Knowledge of data modeling and database design - Experience in ETL (Extract, Transform, Load) processes - Familiarity with cloud storage solutions - Strong problem-solving abilities in data engineering challenges - Understanding of data security and scalability - Proficiency in relevant tools like Apache Airflow Interested candidates can share your resume to sangeetha.spstaffing@gmail.com with the below inline details. Full Name as per PAN: Mobile No: Alt No/ WhatsApp No: Total Exp: Relevant Exp in GCP: Rel Exp in BigQuery: Rel Exp in Composer: Rel Exp in Python: Current CTC: Expected CTC: Notice Period (Official): Notice Period (Negotiable)/Reason: Date of Birth: PAN number: Reason for Job Change: Offer in Pipeline (Current Status): Availability for virtual interview on weekdays between 10 AM-4 PM (please mention time): Current Res Location: Preferred Job Location: Whether educational % in 10th std, 12th std, UG is all above 50%? Do you have any gaps in between your education or career? If having a gap, please mention the duration in months/years:
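Orchestration tools like Apache Airflow, named in the skills above, execute tasks in an order derived from their declared dependencies, i.e. a topological sort of the DAG. A minimal illustration of that ordering logic in plain Python, independent of Airflow itself; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms, which both
# feed a load step -- the kind of DAG a Composer/Airflow job encodes.
deps = {
    "transform_orders": {"extract"},
    "transform_users": {"extract"},
    "load_warehouse": {"transform_orders", "transform_users"},
}
order = list(TopologicalSorter(deps).static_order())
# 'extract' always comes first and 'load_warehouse' always last;
# the two transforms may run in either order (or in parallel).
```

This is the same guarantee Airflow gives when you wire tasks with `>>`: a task never starts until every upstream task has succeeded.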

Posted 2 weeks ago

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Around 3+ years of experience in development in Data warehousing projects on GCP platforms; good analytical and problem-solving skills, effective business communication, and the ability to work independently end to end. Strong expertise in SQL & PL/SQL. Must have GCP services & BigQuery knowledge; GCP certification is an added advantage. Good experience in GCP Dataproc, Cloud Composer, DAGs, Airflow. Good experience in Teradata or any database. Python is an added advantage. Ability to lead the team members, convert requirements into technical solutions, assign the work, coordinate the team, review the deliverables, and ensure on-schedule delivery.
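The "strong expertise in SQL" these roles ask for generally includes analytic (window) functions. BigQuery itself is not needed to practise the pattern; the same query shape runs against SQLite via Python's standard library, with minor syntax differences between engines:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 300), ('south', 50), ('south', 250);
""")
# Rank rows within each region by amount -- a classic window function,
# written the same way it would be in BigQuery SQL.
rows = conn.execute("""
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()
# rank 1 per region is the top sale: ('north', 300) and ('south', 250)
```

`PARTITION BY` restarts the ranking for each region, which is exactly the behaviour interviewers probe when they ask candidates to find the "top N per group" without a self-join.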

Posted 2 weeks ago

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About VOIS VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group’s partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. About VOIS India In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Role Key Responsibilities: Design and manage scalable infrastructure on Google Cloud Platform (GCP) to support various application and data workloads. Implement and manage IAM policies, roles, and permissions to ensure secure access across GCP services. Build and optimize workflows using Cloud Composer (Airflow) and manage data processing pipelines via Dataproc. Provision and maintain Compute Engine VMs and integrate them into broader system architectures. Set up and query data in BigQuery, and manage data flows securely and efficiently. 
Develop and maintain CI/CD pipelines using Argo CD, Jenkins, or GitOps methodologies. Administer Kubernetes clusters (GKE) including node scaling, workload deployments, and Helm chart management. Create and maintain YAML files for defining infrastructure as code. Monitor system health and performance using tools like Prometheus, Grafana, and GCP’s native monitoring stack. Troubleshoot infrastructure issues, perform root cause analysis, and implement preventative measures. Collaborate with development teams to integrate infrastructure best practices and support application delivery. Document infrastructure standards, deployment processes, and operational procedures. Participate in Agile ceremonies, contributing to sprint planning, daily stand-ups, and retrospectives. Experience Must-Have: At least 3 years of hands-on experience with Google Cloud Platform (GCP). Strong understanding and hands-on experience with IAM, Cloud Composer, Dataproc, Compute VMs, BigQuery, and networking on GCP. Proven experience with Kubernetes cluster administration (GKE preferred). Experience with CI/CD pipelines using tools like Argo CD, Jenkins, or GitOps workflows. Experience in writing and managing Helm charts for Kubernetes deployments. Proficiency in scripting (Bash, Python, or similar) for automation. Familiarity with observability and monitoring tools (Prometheus, Grafana, Cloud Monitoring). Solid understanding of DevOps practices, infrastructure-as-code, and container orchestration. Good To Have Experience working in an Agile environment. Familiarity with data workflows, data engineering pipelines, or ML orchestration on GCP. Exposure to GitHub Actions, Spinnaker, or other CI/CD tools. Experience with service mesh (Istio, Linkerd) and secrets management tools (Vault, Secret Manager). India VOIS Equal Opportunity Employer Commitment: VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. 
We believe that being authentically human and inclusive powers our employees’ growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have also been highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024. These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!

Posted 2 weeks ago

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Arcadis is the world's leading company delivering sustainable design, engineering, and consultancy solutions for natural and built assets. We are more than 36,000 people, in over 70 countries, dedicated to improving quality of life. Everyone has an important role to play. With the power of many curious minds, together we can solve the world’s most complex challenges and deliver more impact together. Role Purpose: You will join our Water Team, where we are creating and analysing spatial data for planning, resource management, habitat conservation, wetland restoration, hydro-morphological measures or river regulation, energy projects etc. Autodesk platforms sustain our aim to provide competent, reliable, skilled solutions in a cost-effective way. We are here to protect our natural environment and water resources, while powering our world for future generations. Your role will include, but is not limited to: Design works for various water projects: flood protection projects and wastewater system networks. To study and analyse the details involved in the survey reports and any other data that has the geological or topographical details, and to pay attention to the details in the blueprints, maps, and other related drawings. Dike reinforcement solutions, detailing special hydro constructions. Help in creating concepts and designing structures to support the assigned purpose. Maintain contact with the line manager and deliver the input in time/budget. Qualifications & Experience: Mandatory: Bachelor’s Degree (Civil Engineering); Good understanding of hydraulic structures design. Strong Civil 3D practical knowledge. 5 to 9 years of professional experience in wastewater systems or flood projects, in a design position. Sound communication & writing skills. Nice to have: Subassembly Composer, Dynamo or other automation tools. Key attributes: Professional approach to time, costs and deadlines. 
Great interpersonal skills (teamwork, supportive attitude, eager to learn, proactivity). Why Arcadis? We can only achieve our goals when everyone is empowered to be their best. We believe everyone's contribution matters. It’s why we are pioneering a skills-based approach, where you can harness your unique experience and expertise to carve your career path and maximize the impact we can make together. You’ll do meaningful work, and no matter what role, you’ll be helping to deliver sustainable solutions for a more prosperous planet. Make your mark, on your career, your colleagues, your clients, your life and the world around you. Together, we can create a lasting legacy. Our Commitment to Equality, Diversity, Inclusion & Belonging We want you to be able to bring your best self to work every day which is why equality and inclusion is at the forefront of all our activities. Our ambition is to be an employer of choice and provide a great place to work for all our people. We are an equal opportunity employer; women, minorities, and people with disabilities are strongly encouraged to apply. We are dedicated to a policy of non-discrimination in employment on any basis including race, caste, creed, colour, religion, sex, age, disability, marital status, sexual orientation, and gender identity. #Join Arcadis. #Create a Legacy. #Hybrid

Posted 2 weeks ago

5.0 years

0 Lacs

Noida

On-site

As a Data Engineer with a focus on migrating on-premises databases to Google Cloud SQL, you will play a critical role in solving complex problems and creating value for our business by ensuring reliable, scalable, and efficient data migration processes. You will be responsible for architecting, designing, and implementing custom pipelines on the GCP stack to facilitate seamless migration. Required Skills: 5+ years of industry experience in data engineering, business intelligence, or a related field with experience in manipulating, processing, and extracting value from datasets. Expertise in architecting, designing, building, and deploying internal applications to support technology life cycle management, service delivery management, data, and business intelligence. Experience in developing modular code for versatile pipelines or complex ingestion frameworks aimed at loading data into Cloud SQL and managing data migration from multiple on-premises sources. Strong collaboration with analysts and business process owners to translate business requirements into technical solutions. Proficiency in coding with scripting languages (Shell scripting, Python, SQL). Deep understanding and hands-on experience with Google Cloud Platform (GCP) technologies, especially in data migration and warehousing, including Database Migration Service (DMS), Cloud SQL, BigQuery, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage (GCS), IAM, Compute Engine, Cloud Data Fusion, and optionally Dataproc. Adherence to best development practices including technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular, self-sustaining code. Familiarity with CI/CD processes using GitHub, Cloud Build, and Google Cloud SDK.
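Migration pipelines of the kind described above usually move rows from the source in fixed-size batches rather than one unbounded read, so memory stays flat and partial progress is resumable. A hedged sketch of the batching core in plain Python; the `load_batch` callback is a stand-in for a real Cloud SQL insert:

```python
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items from any iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def migrate(rows, load_batch, batch_size=500):
    """Stream `rows` into the target in batches; returns rows moved.
    `load_batch` would wrap e.g. an executemany() against Cloud SQL."""
    moved = 0
    for chunk in batched(rows, batch_size):
        load_batch(chunk)
        moved += len(chunk)
    return moved
```

Because `batched` consumes a lazy iterator, the same loop works whether `rows` is a list, a server-side cursor, or a generator reading from the on-premises source.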

Posted 2 weeks ago

0 years

0 Lacs

Noida

On-site

As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design. Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP Data Technologies Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. 
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion and Dataproc (good to have).
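Modeling data to support business intelligence, as the responsibilities above require, often means splitting flat event records into dimension and fact tables joined by surrogate keys. A simplified, illustrative sketch of that split in plain Python; the table and field names are hypothetical:

```python
def build_star(events):
    """Split flat event records into a product dimension (with integer
    surrogate keys) and a fact table referencing the dimension by key."""
    dim_product = {}           # product name -> surrogate key
    facts = []
    for event in events:
        name = event["product"]
        if name not in dim_product:
            dim_product[name] = len(dim_product) + 1
        facts.append({"product_key": dim_product[name],
                      "amount": event["amount"]})
    return dim_product, facts

dim, facts = build_star([
    {"product": "widget", "amount": 10},
    {"product": "gadget", "amount": 20},
    {"product": "widget", "amount": 5},
])
# 'widget' is assigned key 1 once and reused for the third event
```

In a warehouse like BigQuery the same idea appears as a MERGE into the dimension table followed by a keyed insert into the fact table; the surrogate key keeps the fact table narrow and the descriptive attributes in one place.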

Posted 2 weeks ago

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Overview Job Title - Engineer Location - Bangalore, India Role Description We're seeking a talented and enthusiastic Java Engineer with DevOps and management skills. In this role, you'll be a key player in designing, developing, and deploying robust back-end systems that power our applications. Experience with cloud deployments is a big plus! The IMS (Identity Management Services) team is part of the Identity & Access Engineering and Architecture function. The team designs, builds, and operates strategic solutions for managing the request, approval, provision, review, and revocation of access to IT assets across the organization. The IMS team manages 6 applications: dbSRS, dB Passport, PDS, SIMS, ISDW, Jupiter. dbSRS: a SOA BPM-based application created for ID Administration processes and request capture interfaces. dB Passport: an application for managing and monitoring change to the Bank's Network Information system. ISDW: The Information Security Data Warehouse is a reporting system that matches all of Deutsche Bank's system accounts with the people records of HR. PDS: Passwords are registered with PDS either manually or via a Web Service interface and subsequently stored in an encrypted format within the PDS database. SIMS: an interface to Active Service Directory to perform operations like create, modify, and delete on ASD objects for users and groups. The solutions manage processes driven by Human Resources such as Joiners, Movers, and Leavers, orchestrated automatically in conjunction with other services across the organization, or by user request. Our solutions have global scope, reaching everyone in the organization, and managing access to infrastructure, data, and a large number (several thousands) of business applications. 
What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term Life Insurance Complimentary health screening for 35 yrs. and above Your Key Responsibilities In general, technical requirements: Design, develop, and maintain scalable, reliable, and efficient back-end services using Java and the Spring Framework. Migrate, deploy, manage, and monitor applications in cloud environments (GCP). Ensure code quality and maintainability through best practices like code reviews, unit testing, and documentation. Troubleshoot and resolve complex technical issues in a timely manner. Continuously improve our development processes and tools. Management requirements: Manage our stakeholders and report deliverables in our Senior Management sessions. Understand and consider operational, regulatory, and other risks related to the IMS applications. Engage with multiple technical and non-technical stakeholders within and outside the bank to resolve issues and dependencies. Report to management on critical incidents and work with vendors on the resolution plan. Your Skills And Experience Technical Skills: Strong proficiency in Java and the Spring Framework (Spring MVC, Spring Security, etc.). Cloud: Familiarity with cloud platforms (GCP) and experience in deploying and managing applications in the cloud. Database: Solid understanding of relational databases (e.g. Oracle) Soft Skills: Problem-Solving: Excellent analytical and problem-solving skills Strong team player, able to work in virtual, global teams in a matrix organization, cross location and with a mix of vendors and internals. 
Strong relationship management, analytical, problem-solving, strategic agility, communication, influencing and presentation skills. Proven “pick up the phone and get things sorted” attitude. Experiences: Experience with Identity and Access Management will be a plus. Good knowledge of application servers (WebLogic Server, Tomcat, JBoss) Good knowledge of Linux, Unix (AIX, Solaris) distros, Windows systems Experience with any continuous integration and continuous delivery tools (TeamCity, Jenkins, etc.) Good knowledge of any build tool (TeamCity, etc.) Good knowledge of any version control system (GitHub, Git, etc.) Experience with any scripting language (Bash, PowerShell, etc.) Experience with deployment, configuration, and support of J2EE applications Experience with secure environments (certificates, secure connections, etc.) Experience with Google Cloud services (Terraform, BigQuery, Looker, Cloud Composer, etc.) or willingness to learn. How We’ll Support You Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 weeks ago

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of the client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of BigQuery, Snowflake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, Snowflake, Redshift, Familiarity with NoSQL databases such as MongoDB, etc. ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica Cloud)) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients using GCP cloud architecture Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills Working knowledge of data modeling, data structures, databases, and ETL processes Strong understanding of relational and non-relational databases and when to use them Leadership and communication skills to collaborate with local leadership as well as our global teams Translating technical requirements into ETL/SQL application code Document project architecture, explain the detailed design to the team, and create low-level to high-level design Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages Will need to engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions Perform mid to complex-level tasks independently Support Clients, Data Scientists, and Analytical Consultants working on marketing solutions Work with cross-functional internal teams and external clients Strong project management and organization skills. Ability to lead 1 – 2 projects of team size 2 – 3 team members. Code management systems, which include code review and deployment Work closely with the QA / Testing team to help identify/implement defect reduction initiatives Work closely with the Architecture team to make sure architecture standards and principles are followed during development Performing Proofs of Concept on new platforms/validating proposed solutions Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures Must understand software development methodologies including waterfall and agile Distribute and manage SQL development work across the team The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs. 
Qualifications: Bachelor’s or Master's Degree in Computer Science with >= 7 years of IT experience Location: Bangalore Brand: Merkle Time Type: Full time Contract Type: Permanent

Posted 2 weeks ago

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of Big Query, SnowFlake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, SnowFlake, Redshift, Familiar with NoSQL such as MongoDB, etc ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica cloud) ) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients by using cloud architecture using GCP Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills Working knowledge of data modeling, data structures, databases, and ETL processes Strong understanding of relational and non-relational databases and when to use them Leadership and communication skills to collaborate with local leadership as well as our global teams Translating technical requirements into ETL/ SQL application code Document project architecture, explain the detailed design to the team, and create low-level to high-level design Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office package Will need to engage with Project Managers, Business Analysts, and Application DBA to implement ETL Solutions Perform mid to complex-level tasks independently Support Clients, Data Scientists, and Analytical Consultants working on marketing solution Work with cross-functional internal teams and external clients Strong project management and organization skills . Ability to lead 1 – 2 projects of team size 2 – 3 team members. Code management systems which include Code review, deployment, cod Work closely with the QA / Testing team to help identify/implement defect reduction initiatives Work closely with the Architecture team to make sure Architecture standards and principles are followed during development Performing Proof of Concepts on new platforms/validating proposed solutions Work with the team to establish and reinforce disciplined software development, processes, standards, and error recovery procedures are deployed Must understand software development methodologies including waterfall and agile Distribute and manage SQL development Work across the team The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs. 
Qualifications: Bachelor's or Master's degree in Computer Science with >= 7 years of IT experience

Location: Bangalore
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
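The data-warehouse side of the role (BigQuery SQL with CTEs and window functions) can be sketched as follows. SQLite is used here purely as a local stand-in for BigQuery, and the `orders` table with its `customer_id`/`amount` columns is invented for the example.

```python
import sqlite3

# CTE + window-function sketch; SQLite stands in for BigQuery here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 10.0), ("a", 30.0), ("b", 20.0)],
)

# Rank each customer's orders by amount (largest first),
# then keep only each customer's top order.
rows = conn.execute("""
    WITH ranked AS (
        SELECT
            customer_id,
            amount,
            ROW_NUMBER() OVER (
                PARTITION BY customer_id ORDER BY amount DESC
            ) AS rn
        FROM orders
    )
    SELECT customer_id, amount FROM ranked WHERE rn = 1
    ORDER BY customer_id
""").fetchall()

print(rows)  # [('a', 30.0), ('b', 20.0)]
```

The same `WITH ... ROW_NUMBER() OVER (PARTITION BY ...)` pattern runs unchanged in BigQuery standard SQL, which is what makes it a common interview and screening topic for roles like this one.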

Posted 2 weeks ago

Exploring Composer Jobs in India

India has a growing market for composer jobs, with various opportunities available for talented individuals in the music industry. Whether it's creating music for films, television, video games, or other media, composers play a vital role in shaping the overall experience for audiences. If you're considering a career in composing, here's a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Mumbai
  2. Chennai
  3. Bangalore
  4. Hyderabad
  5. Delhi

These cities are known for their vibrant entertainment industries and often have a high demand for composers across various projects.

Average Salary Range

The average salary range for composer professionals in India can vary depending on experience and expertise. Entry-level composers can expect to earn between INR 3-5 lakhs per year, while experienced composers with a strong portfolio can earn upwards of INR 10 lakhs per year.

Career Path

In the field of composing, a typical career path may involve starting as a Junior Composer, then progressing to a Composer, Senior Composer, and eventually a Music Director or Lead Composer. As you gain more experience and recognition for your work, you may have the opportunity to work on larger projects and collaborate with well-known artists.

Related Skills

In addition to composing skills, it is beneficial for composers to have a good understanding of music theory, proficiency in music production software, excellent communication skills for collaborating with directors and producers, and the ability to work under tight deadlines.

Interview Questions

  • What inspired you to pursue a career in composing? (basic)
  • Can you walk us through your creative process when composing music for a project? (medium)
  • How do you handle feedback and revisions from clients or directors? (medium)
  • Can you discuss a challenging project you worked on and how you overcame obstacles during the composition process? (advanced)
  • How do you stay updated on current trends in the music industry and incorporate them into your work? (medium)
  • Have you ever had to compose music for a project with a tight deadline? How did you manage your time effectively? (medium)
  • Can you provide examples of different genres or styles of music you are comfortable composing? (medium)
  • How do you ensure that your music aligns with the overall vision of a project? (advanced)
  • Have you ever collaborated with other musicians or artists on a composition? How did you approach that collaboration? (medium)
  • What software or tools do you use for composing and producing music? (basic)
  • Can you discuss a piece of music you composed that you are particularly proud of? (medium)
  • How do you handle creative blocks or moments of inspiration? (medium)
  • What is your experience working with live musicians or orchestras for recording sessions? (advanced)
  • How do you approach negotiating fees or contracts for your composition work? (medium)
  • Can you discuss a project where you had to compose music for a specific cultural or historical context? (advanced)
  • How do you ensure that your music is original and does not infringe on copyright laws? (medium)
  • Have you ever had to rework a composition multiple times based on client feedback? How did you handle that situation? (medium)
  • What do you think sets your composing style apart from others in the industry? (medium)
  • How do you approach creating a memorable and impactful musical theme for a project? (medium)
  • Can you discuss a project where you had to compose music for a non-traditional or experimental medium? (advanced)
  • How do you balance artistic integrity with meeting the client's expectations and requirements? (medium)
  • What is your process for creating a soundtrack that enhances the emotional impact of a scene in a film or game? (medium)
  • Can you discuss a time when you had to compose music that evoked a specific mood or atmosphere? (medium)
  • How do you approach collaborating with sound designers or audio engineers to enhance the overall sound of a project? (medium)

Closing Remark

As you prepare for composer roles in India, remember to showcase your unique talents and passion for music in your portfolio and interviews. With dedication and creativity, you can pursue a rewarding career in composing and contribute to the vibrant entertainment industry in India. Good luck with your job search!
