
905 Data Flow Jobs - Page 16

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

4.0 - 8.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Description
- Provides leadership for the overall architecture, design, development, and deployment of a full-stack, cloud-native data analytics platform.
- Designs and augments solution architecture for data ingestion, data preparation, data transformation, data load, ML and simulation modelling, Java back end and front end, state machine, API management, and intelligence consumption using data products on the cloud.
- Understands business requirements and helps develop high-level and low-level data engineering and data processing documentation for the cloud-native architecture.
- Develops conceptual, logical, and physical target-state architecture, engineering, and operational specs.
- Works with the customer, users, technical architects, and application designers to define the solution requirements and structure for the platform.
- Models and designs the application data structure, storage, and integration.
- Leads the database analysis, design, and build effort.
- Works with the application architects and designers to design the integration solution.
- Ensures that the database designs fulfil the requirements, including data volume, frequency needs, and long-term data growth.
- Able to perform data engineering tasks using Spark.
- Knowledge of developing efficient frameworks for development and testing (Sqoop, NiFi, Kafka, Spark, Spark Streaming, WebHDFS, Python) to enable seamless data ingestion onto Hadoop/BigQuery platforms.
- Enables data governance and data discovery.
- Exposure to job-monitoring frameworks along with validation automation.
- Exposure to handling structured, unstructured, and streaming data.
Technical Skills
- Experience building a data platform on the cloud (data lake, data warehouse environment, Databricks).
- Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived/analytic data.
- Proven background of designing and implementing architectural solutions that solve strategic and tactical business needs.
- Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing.
- Highly competent with database design and data modeling.
- Strong data warehousing and business intelligence skills, including handling ELT and scalability issues for an enterprise-level data warehouse, and creating ETLs/ELTs to handle data from various sources and formats.
- Strong hands-on experience with a programming language such as Python or Scala, with Spark and Beam.
- Solid hands-on and solution-architecting experience in cloud technologies: AWS, Azure, and GCP (GCP preferred).
- Hands-on experience with data processing at scale using event-driven systems and message queues (Kafka, Flink, Spark Streaming).
- Hands-on experience with GCP services such as BigQuery, Dataproc, Pub/Sub, Dataflow, Cloud Composer, API Gateway, data lake, Bigtable, Spark, and Apache Beam, including feature engineering and data processing for model development.
- Experience gathering and processing raw data at scale (including writing scripts, web scraping, calling APIs, and writing SQL queries).
- Experience building data pipelines for structured/unstructured, real-time/batch, and event-driven/synchronous/asynchronous workloads using MQ, Kafka, and stream processing.
- Hands-on experience analyzing source-system data and data flows, working with structured and unstructured data.
- Must be very strong in writing SparkSQL queries.
- Strong organizational skills, with the ability to work autonomously as well as to lead a team.
- Pleasant personality with strong communication and interpersonal skills.

Qualifications
- A bachelor's degree in computer science, computer engineering, or a related discipline is required to work as a technical lead.
- Certification in GCP would be a big plus.
- Individuals in this field can further demonstrate their leadership skills by completing the Project Management Professional certification offered by the Project Management Institute.
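The "very strong in writing SparkSQL queries" requirement above is about expressing aggregations in SQL over large tables. A hedged sketch: since a Spark cluster is not needed to show the SQL itself, this example runs an aggregation of the same shape through Python's built-in sqlite3; in PySpark the identical query text would be passed to spark.sql(). The events table and its columns are hypothetical, not taken from the posting.

```python
import sqlite3

# Hypothetical events table standing in for a Spark table/DataFrame view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, channel TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("u1", "web", 10.0), ("u1", "mobile", 5.0), ("u2", "web", 7.5)],
)

# The kind of aggregation a SparkSQL query would express.
rows = conn.execute(
    """
    SELECT channel, COUNT(DISTINCT user_id) AS users, SUM(amount) AS total
    FROM events
    GROUP BY channel
    ORDER BY total DESC
    """
).fetchall()
```

The same text, unchanged, is valid SparkSQL; only the execution engine differs.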

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Chennai, Bengaluru

Work from Office

Job Description: Job Title: Data Engineer. Experience: 5-8 years. Location: Chennai, Bangalore. Employment Type: Full time. Job Type: Work from office (Mon-Fri). Shift Timing: 12:30 PM to 9:30 PM. Required Skills: Strong financial services (preferably banking) experience; ability to translate financial and accounting concepts into business and systems requirements; data analysis; identifying data anomalies and providing remediation options; data mapping; strong database design concepts; good familiarity with SQL; assisting in the creation of metadata, data lineage, and data flow diagrams; supporting UAT planning and execution functions.
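The "identify data anomalies and provide remediation options" task above can be sketched in miniature. This is a hedged, framework-free illustration in plain Python; the transaction fields and validation rules are hypothetical, not taken from the posting.

```python
# Hypothetical banking transaction records with two deliberately dirty rows.
transactions = [
    {"id": 1, "account": "A-100", "amount": 250.00},
    {"id": 2, "account": "A-100", "amount": -40.00},  # negative amount
    {"id": 3, "account": None, "amount": 99.99},      # missing account
]

def find_anomalies(records):
    """Return (record id, reason) pairs for rows failing basic checks."""
    issues = []
    for rec in records:
        if rec["account"] is None:
            issues.append((rec["id"], "missing account"))
        if rec["amount"] is not None and rec["amount"] < 0:
            issues.append((rec["id"], "negative amount"))
    return issues

anomalies = find_anomalies(transactions)
```

In practice these rules would run as SQL checks against the source system, but the analysis step (rule, finding, remediation option) has this shape.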

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Chennai

Work from Office

" PLEASE READ THE JOB DESTCRIPTION AND APPLY" Data Engineer Job Description Position Overview Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present. - Master Oogway Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development. Sometimes the hardest choices require the strongest wills. - Thanos (but we promise, our data decisions are much easier! ) In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics. What You'll Do: - Design and implement enterprise-grade data pipelines for marketing data ingestion and processing - Build and optimize data warehouses and data lakes to support digital marketing analytics - Ensure data quality, security, and compliance across all marketing data systems - Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking - Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations - Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions Why This Role Matters: I can do this all day. - Captain America (and you'll want to, because this role is that rewarding!) 
You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting. Sometimes you gotta run before you can walk. - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data!)

Our Philosophy:
We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time. Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be. - Soothsayer

What Makes You Special:
We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives. Inner peace... Inner peace... Inner peace... - Po (because we know data engineering can be challenging, but we've got your back!
)

Key Responsibilities

Data Pipeline Development
- Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
- Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
- Implement data transformation and cleaning processes to ensure data quality and consistency
- Optimize data pipeline performance and reliability

Data Infrastructure Management
- Design and implement data warehouse architectures
- Manage and optimize database systems (SQL and NoSQL)
- Implement data lake solutions and data governance frameworks
- Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture
- Design and implement data models for analytics and reporting
- Create and maintain data dictionaries and documentation
- Develop data schemas and database structures
- Implement data versioning and lineage tracking

Data Quality, Security, and Compliance
- Ensure data quality, integrity, and consistency across all marketing data systems
- Implement and monitor data security measures to protect sensitive information
- Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA)
- Develop and enforce data governance policies and best practices

Collaboration and Support
- Work closely with Data Scientists, Analysts, and business stakeholders
- Provide technical support for data-related issues and queries

Monitoring and Maintenance
- Implement monitoring and alerting systems for data pipelines
- Perform regular maintenance and optimization of data systems
- Troubleshoot and resolve data pipeline issues
- Conduct performance tuning and capacity planning

Required Qualifications

Experience
- 2+ years of experience in data engineering or related roles
- Proven experience with ETL/ELT pipeline development
- Experience with a cloud data platform (GCP)
- Experience with big data technologies

Technical Skills
- Programming Languages: Python, SQL, Golang (preferred)
- Databases: PostgreSQL, MySQL, Redis
- Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, DBT, Dataform
- Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)
- Data Warehousing: Google BigQuery
- Data Visualization: Superset, Looker, Metabase, Tableau
- Version Control: Git, GitHub
- Containerization: Docker

Soft Skills
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration skills
- Ability to work independently and in team environments
- Strong attention to detail and data quality
- Continuous learning mindset

Preferred Qualifications

Additional Experience
- Experience with real-time data processing and streaming
- Knowledge of machine learning pipelines and MLOps
- Experience with data governance and data catalog tools
- Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
- Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with the machine, not against it)

Interview Process
1. Initial Screening: Phone/video call with HR
2. Technical Interview: Deep dive into data engineering concepts
3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture
We are Groot. - We work together, we grow together, we succeed together. We believe in:
- Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible
- Team Over Individual - Like the Avengers, we're stronger together than apart
- Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving
- Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!)
Growth Journey
There is no charge for awesomeness... or attractiveness. - Po
Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior:
- Level 1: Master the basics of our data infrastructure
- Level 2: Build and optimize data pipelines
- Level 3: Lead complex data projects and mentor others
- Level 4: Become a data engineering legend (with your own theme music!)

What We Promise
I am Iron Man. - We promise you'll feel like a superhero every day!
- Work that matters - Every pipeline you build helps real marketers succeed
- Growth opportunities - Learn new technologies and advance your career
- Supportive team - We've got your back, just like the Avengers
- Work-life balance - Because even superheroes need rest!
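The pipeline responsibilities described above (ingestion, transformation and cleaning, data-quality checks) can be sketched in miniature. This is a hedged, framework-free illustration in plain Python; a real pipeline at this scale would use the Airflow/Kafka/BigQuery stack the posting lists, and every field name here is hypothetical.

```python
# Hypothetical raw marketing events, including two rows that fail quality rules.
raw_events = [
    {"user": "u1", "clicks": "3", "campaign": "spring"},
    {"user": "u2", "clicks": "bad", "campaign": "spring"},  # non-numeric clicks
    {"user": "u3", "clicks": "7", "campaign": None},        # missing campaign
]

def transform(rows):
    """Clean and type-cast rows, dropping any that fail validation."""
    clean = []
    for row in rows:
        try:
            clicks = int(row["clicks"])
        except (TypeError, ValueError):
            continue  # quality rule: clicks must be an integer
        if row["campaign"] is None:
            continue  # quality rule: campaign is required
        clean.append({"user": row["user"], "clicks": clicks,
                      "campaign": row["campaign"]})
    return clean

cleaned = transform(raw_events)
```

The extract and load stages on either side of this transform would be connectors (ad-platform APIs in, a warehouse out); the validate-and-drop discipline shown is the part that keeps downstream analytics trustworthy.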

Posted 1 month ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Noida, New Delhi, Gurugram

Hybrid

Role & responsibilities
- Strategically partner with the Customer Cloud Sales Team to identify and qualify business opportunities and identify key customer technical objections.
- Develop strategies to resolve technical obstacles and architect client solutions to meet complex business and technical requirements.
- Lead the technical aspects of the sales cycle, including technical trainings, client presentations, technical bid responses, product and solution briefings, and proof-of-concept technical work.
- Identify and respond to key technical objections from clients, providing prescriptive guidance for successful resolutions tailored to specific client needs.
- May work directly with the customer's cloud products to demonstrate, design, and prototype integrations in customer/partner environments.
- Develop and deliver thorough product messaging to highlight advanced technical value propositions, using techniques such as whiteboard and slide presentations, technical product demonstrations, white papers, trial management, and RFI response documents.
- Assess technical challenges to develop and deliver recommendations on integration strategies, enterprise architectures, platforms, and application infrastructure required to successfully implement a complete solution.
- Leverage technical expertise to provide best-practice counsel to optimize the effectiveness of advanced technical products.

Other critical functions and responsibilities
- Ensure customer data is accurate and actionable using Salesforce.com (SFDC) systems.
- Leverage third-party prospect and account intelligence tools to extract meaningful insights and support varying client needs.
- Navigate, analyse, and interpret technical documentation for technical products, often including Customer Cloud products.
- Enhance skills and knowledge by using a Learning Management Solution (LMS) for training and certification.
- Serve as a technical and subject-matter expert to support advanced trainings for team members on moderate to highly complex technical subjects.
- Offer thought leadership in advanced technical solutions, such as cloud computing.
- Coach and mentor team members and advise managers on creating business and process efficiencies in internal workflows and training materials.
- Collect and codify best practices between sales, marketing, and sales engineers.

Preferred candidate profile

Required Qualifications
- Bachelor's degree in Computer Science or another technical field, or equivalent practical experience (preferred).
- 3-5 years of experience serving as a technical Sales Engineer in an advanced technical environment.
- Prior experience with advanced technologies, such as Big Data, PaaS, and IaaS.
- Proven strong communication skills (written and verbal) with a proactive and positive approach to task management.
- Confident presenter with excellent presentation and persuasion skills.
- Strong work ethic and ability to work independently.

Perks and benefits

Posted 1 month ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Hyderabad, Bengaluru

Work from Office

Notice: Immediate to 60 days.

The primary role of the Business Analyst is to effectively support the business and project teams in requirements gathering, design, analysis, and documentation writing that contributes to the development of optimal solutions. The candidate should be able to mentor a set of business analysts and review and enhance the solutions offered by them, working with our product or client organizations to ensure we retain core integrity in the solution, and that the enhancements, applications, and localizations that you design fit with our overall strategy.

Providing quality deliverables across the following key work elements:
- Requirements elicitation
- Use case definition
- Process mapping
- Creation of data flow diagrams, BPMN diagrams, and workflow diagrams
- Functional design definition and test scope definition

Candidate Requirements
We are looking for bright, highly motivated, and ambitious Business Analysts to join our team. The position involves eliciting and documenting business requirements and working directly with product teams and clients, as well as seeing each project through development, testing, and implementation. The successful candidate will be expected to work in a professional manner alongside other teams as well as on their own. To succeed in this role, you will need:
- An ability to look holistically across technology, people, process, and data when defining solutions
- A requirement-driven (not solution-driven) attitude to Business Analysis
- Great people skills - you will be working with a variety of technical, business, and product personnel at junior, mid, and senior levels of the organization
- High quality standards - we pride ourselves on the accuracy and quality of our delivery
- An interest in technology and the payments industry

The candidates must:
- Have demonstrated experience in Business Analysis (product/solution definition preferred)
- Have a passion for business solutions
- Be a self-starter, and work well in a team
- Be fluent in English, both written and spoken
- Be a proficient Microsoft Office user (Word, Excel, PowerPoint, Visio)
- Be proficient in the use of Confluence and Jira Agile delivery tools
- Have managed a team of 3+ Business Analysts

Posted 1 month ago

Apply

8.0 - 12.0 years

22 - 32 Lacs

Noida, Pune, Bengaluru

Hybrid

Build and optimize ELT/ETL pipelines using BigQuery, GCS, Dataflow, Pub/Sub, and orchestration services (Composer/Airflow).
- Hands-on experience building ETL/ELT pipelines and developing software in Python
- Experience working with data warehouses, data warehouse technical architectures, and reporting/analytic tools
- Develop and implement data quality and governance procedures to ensure the accuracy and reliability of data
- Extensive skills and success in the implementation of technology projects within a professional environment, with a particular focus on data engineering
- Eager to learn and explore new services within GCP to enhance skills and contribute to projects
- Excellent communication, presentation, and problem-solving skills
- Prior experience with an ETL tool such as DBT, Talend, etc.

Good to have skills:
- AI/ML or Gen AI background
- IAM, Cloud Logging, and Monitoring
- Coaching junior data engineering personnel, bringing them up to speed and helping them build a better understanding of the overall data ecosystem
- Working experience with Agile methodologies and CI/CD tools like Terraform/Jenkins
- Working on solution decks, IP builds, and client meetings for requirement gathering
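One recurring pattern behind the ELT pipeline work described above is the incremental (watermark-based) load: on each run, pick up only rows changed since the last successful load. A hedged sketch in plain Python with hypothetical row shapes; on GCP the same filter would typically be a partitioned BigQuery query scheduled by Composer/Airflow.

```python
from datetime import date

# Hypothetical source rows with a last-modified date per record.
source_rows = [
    {"id": 1, "updated": date(2024, 1, 1)},
    {"id": 2, "updated": date(2024, 1, 5)},
    {"id": 3, "updated": date(2024, 1, 9)},
]

def incremental_batch(rows, watermark):
    """Select only rows modified after the last successful load."""
    return [r for r in rows if r["updated"] > watermark]

# Suppose the previous run loaded everything through 2024-01-04.
batch = incremental_batch(source_rows, watermark=date(2024, 1, 4))
new_watermark = max(r["updated"] for r in batch)  # persist for the next run
```

Persisting the new watermark after a successful load is what makes reruns idempotent: a failed run leaves the watermark untouched, so the next attempt re-selects the same batch.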

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 18 Lacs

Hyderabad

Work from Office

Role: GCP Data Engineer Location: Hyderabad Duration: Full time Roles & Responsibilities: * Design, develop, and maintain scalable and reliable data pipelines using Apache Airflow to orchestrate complex workflows. * Utilize Google BigQuery for large-scale data warehousing, analysis, and querying of structured and semi-structured data. * Leverage the Google Cloud Platform (GCP) ecosystem, including services like Cloud Storage, Compute Engine, AI Platform, and Dataflow, to build and deploy data science solutions. * Develop, train, and deploy machine learning models to solve business problems such as forecasting, customer segmentation, and recommendation systems. * Write clean, efficient, and well-documented code in Python for data analysis, modeling, and automation. * Use Docker to containerize applications and create reproducible research environments, ensuring consistency across development, testing, and production. * Perform exploratory data analysis to identify trends, patterns, and anomalies, and effectively communicate findings to both technical and non-technical audiences. * Collaborate with data engineers to ensure data quality and integrity. * Stay current with the latest advancements in data science, machine learning, and big data technologies.
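Orchestrating "complex workflows" with Airflow, as the first responsibility above describes, ultimately means executing tasks in dependency order over a DAG. A hedged sketch of that core idea in plain Python; the task names are hypothetical, and in Airflow the same graph would be declared with operators and `>>` dependencies rather than resolved by hand.

```python
# Hypothetical workflow: each task lists the upstream tasks it depends on.
deps = {
    "extract": [],
    "transform": ["extract"],
    "train_model": ["transform"],
    "report": ["transform"],
}

def topo_order(graph):
    """Return tasks in an order that respects upstream dependencies."""
    order, seen = [], set()

    def visit(node):
        if node in seen:
            return
        for upstream in graph[node]:
            visit(upstream)  # schedule dependencies first
        seen.add(node)
        order.append(node)

    for node in sorted(graph):  # sorted for a deterministic result
        visit(node)
    return order

schedule = topo_order(deps)
```

Airflow's scheduler does the same resolution (plus retries, backfills, and parallelism), which is why pipeline code only declares the edges.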

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You are an experienced Architect with 6 to 8 years of expertise in Workday technologies, specifically Workday Studio and Workday Integrations. In this hybrid role, you will be responsible for designing and implementing effective solutions to meet business needs, ensuring seamless integration and functionality within systems.

Your responsibilities will include:
- Designing and implementing Workday solutions aligned with business requirements
- Collaborating with cross-functional teams to gather and analyze integration requirements
- Developing and maintaining Workday Studio integrations for smooth data flow
- Providing technical expertise on Workday best practices
- Conducting thorough testing and validation of integrations
- Troubleshooting and resolving integration issues
- Documenting integration processes
- Staying updated with Workday features and optimizing existing integrations
- Communicating effectively with stakeholders
- Ensuring compliance with company policies and industry standards
- Contributing to continuous improvement of integration processes
- Supporting the IT strategy by aligning Workday solutions with organizational goals

To qualify for this role, you should have a strong understanding of Workday Studio and integration projects, proficiency in developing and managing Workday Integrations, excellent problem-solving skills and attention to detail, a proven track record of successful Workday integration implementations, the ability to work collaboratively in a hybrid work model, strong communication skills, and a proactive approach to learning and adapting to new technologies.

Posted 1 month ago

Apply

10.0 - 15.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are an experienced OCI AI Architect who will be responsible for leading the design and deployment of Gen AI, Agentic AI, and traditional AI/ML solutions on Oracle Cloud. Your role will involve a deep understanding of Oracle Cloud Architecture, Gen AI, Agentic and AI/ML frameworks, data engineering, and OCI-native services. The ideal candidate will possess a combination of deep technical expertise in AI/ML and Gen AI over OCI along with domain knowledge in Finance and Accounting. Your key responsibilities will include designing, architecting, and deploying AI/ML and Gen AI solutions on OCI using native AI services, building agentic AI solutions using frameworks such as LangGraph, CrewAI, and AutoGen, leading the development of machine learning AI/ML pipelines, and providing technical guidance on MLOps, model versioning, deployment automation, and AI governance. You will collaborate with functional SMEs, application teams, and business stakeholders to identify AI opportunities, advocate for OCI-native capabilities, and support customer presentations and solution demos. To excel in this role, you should have 10-15 years of experience in Oracle Cloud and AI, with at least 5 years of proven experience in designing, architecting, and deploying AI/ML & Gen AI solutions over OCI AI stack. Strong Python development experience, knowledge of LLMs such as Cohere and GPT, proficiency in AI/ML/Gen AI frameworks like TensorFlow, PyTorch, Hugging Face, and hands-on experience with OCI services are required. Additionally, skills in AI governance, Agentic AI frameworks, AI architecture principles, and leadership abilities are crucial for success. Qualifications for this position include Oracle Cloud certifications such as OCI Architect Professional, OCI Generative AI Professional, OCI Data Science Professional, as well as a degree in Computer Science or MCA. Any degree or diploma in AI would be preferred. 
Experience with front-end programming languages, Finance domain solutions, Oracle Cloud deployment, and knowledge of Analytics and Data Science would be advantageous. If you are a highly skilled and experienced OCI AI Architect with a passion for designing cutting-edge AI solutions on Oracle Cloud, we invite you to apply and join our team for this exciting opportunity.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer Practitioner at TekWissen in Chennai, you will be a crucial part of the team responsible for the development and maintenance of the Enterprise Data Platform. Your main focus will be on designing, building, and optimizing scalable data pipelines within the Google Cloud Platform (GCP) environment. Working with GCP Native technologies such as BigQuery, Dataform, Dataflow, and Pub/Sub, you will ensure data governance, security, and optimal performance. This role offers you the opportunity to utilize your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at the client. To be successful in this role, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field of study. You should have at least 5 years of experience with a strong understanding of database concepts and multiple database technologies to optimize query and data processing performance. Proficiency in SQL, Python, and Java is essential, along with experience in programming engineering transformations in Python or similar languages. Additionally, you should have the ability to work effectively across different organizations, product teams, and business partners, along with knowledge of Agile (Scrum) methodology and experience in writing user stories. Your skills should include expertise in data architecture, data warehousing, and Google Cloud Platform tools such as BigQuery, Data Flow, Dataproc, Data Fusion, and others. Experience with Data Warehouse concepts, ETL processes, and data service ecosystems is crucial for this role. Strong communication skills are necessary for both internal team collaboration and external stakeholder interactions. Your role will involve advocating for user experience through empathetic stakeholder relationships and ensuring effective communication within the team and with stakeholders. 
As a Software Engineer Practitioner, you should have excellent communication, collaboration, and influence skills to energize the team. Your knowledge of data, software, architecture operations, data engineering, and data management standards will be valuable in this role. Hands-on experience in Python using libraries like NumPy and Pandas is required, along with extensive knowledge of GCP offerings and bundled services related to data operations. You should also have experience in re-developing and optimizing data operations, data science, and analytical workflows and products. TekWissen Group is an equal opportunity employer that supports workforce diversity, and we encourage applicants from diverse backgrounds to apply. Join us in shaping the future of data engineering and making a positive impact on lives, communities, and the planet.

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Pune

Hybrid

Mandatory Skills: Cloud-PaaS-GCP-Google Cloud Platform.
Position: Cloud Data Engineer.
Experience Required: 5-8 years (Additional Experience: 8-13 years).
Work Location: Wipro, PAN India.
Work Arrangement: Hybrid model with 3 days per week in a Wipro office.

Job Description:
- Strong expertise in SQL
- Proficient in Python
- Excellent knowledge of any cloud technology (AWS, Azure, GCP, etc.); GCP preferred
- Familiarity with PySpark preferred

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Artificial Intelligence / Machine Learning
- Mandatory: Deep understanding of AI and machine learning algorithms, including large language models (LLMs) such as Azure OpenAI and Mistral; proficiency in deploying LLMs on-premises or on a local server
- Good to have: Integration with Gemini; knowledge of Agentic AI and Agentic RAG

Cloud Computing
- Mandatory: Proficiency in cloud platforms like Azure for deploying AI solutions and using the existing AI capabilities of the cloud platform; good understanding of Azure Container Registry, Azure Container Instances, and App Services
- Good to have: Proficiency in cloud platforms like AWS and Google Cloud for deploying AI solutions

Programming Skills
- Mandatory: Strong coding skills in languages such as Python; FastAPI and Flask API creation and integration of APIs from third-party tools like SNOW and SharePoint; strong skills in UI development in Angular, Streamlit, HTML, CSS, and JavaScript; expertise in TensorFlow, PyTorch, and relevant AI/ML libraries to fine-tune models
- Good to have: Integration with monitoring tools (e.g., SolarWinds, Splunk, New Relic)

Data Engineering
- Mandatory: Data flow diagrams and data security at rest and in motion
- Good to have: Collaboration with the team to analyze business data, implement the solution, and ensure data security and compliance

Solution Architecture
- Mandatory: Ability to design and implement scalable AI and GenAI solutions tailored to business needs; architecture and design development and walkthroughs with the customer; designing hardware and software requirements to deploy GenAI solutions on-premises or on the cloud; good understanding of multi-factor authentication
- Good to have: Cost estimation for GenAI solution deployment on a cloud platform
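The FastAPI/Flask API-creation skill above boils down to mapping request paths to handlers that return JSON. A hedged, framework-free sketch of that dispatch shape in plain Python; the routes, payload fields, and the local-LLM stand-in are all hypothetical, and a real service would declare these as FastAPI path operations.

```python
import json

def health_handler(_body):
    # Liveness probe, as typically exposed by a deployed model service.
    return {"status": "ok"}

def infer_handler(body):
    # Stand-in for forwarding a prompt to a locally deployed LLM.
    return {"model": "local-llm", "echo": body["prompt"]}

ROUTES = {"/health": health_handler, "/infer": infer_handler}

def handle(path, raw_body="{}"):
    """Dispatch a request path to its handler, returning a JSON string."""
    handler = ROUTES.get(path)
    if handler is None:
        return json.dumps({"error": "not found"})
    return json.dumps(handler(json.loads(raw_body)))

response = handle("/infer", '{"prompt": "hello"}')
```

FastAPI adds type-validated request models, async handling, and generated OpenAPI docs on top of exactly this path-to-handler mapping.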

Posted 1 month ago

Apply

1.0 - 2.0 years

3 - 4 Lacs

Gurugram, Bengaluru

Work from Office

About the Role: Grade Level (for internal use): 08

S&P Global Mobility

The Role: Data Engineer

The Team: We are the Research and Modeling team, driving innovation by building robust models and tools to support the Vehicle & Powertrain Forecast team. Our work includes all aspects of development of, and ongoing support for, our business line's data flows, analyst modelling solutions and forecasts, new apps, new client-facing products, and many other work areas besides. We value ownership, adaptability, and a passion for learning, while fostering an environment where diverse perspectives and mentorship fuel continuous growth.

The Impact: We are seeking a motivated and talented Data Engineer to be a key player in building the robust data infrastructure and flows that support our advanced forecasting models. Your initial focus will be to create a robust data factory to ensure smooth collection and refresh of actual data, a critical component that feeds our forecast. Additionally, you will assist in developing mathematical models and supporting the work of ML engineers and data scientists. Your work will significantly impact our ability to deliver timely and insightful forecasts to our clients.

What's in it for you:
- Opportunity to build foundational data infrastructure that directly impacts advanced forecasting models and client delivery.
- Gain exposure to and support the development of sophisticated mathematical models, Machine Learning, and Data Science applications.
- Contribute significantly to delivering timely and insightful forecasts, influencing client decisions in the automotive sector.
- Work in a collaborative environment that fosters continuous learning, mentorship, and professional growth in data engineering and related analytical fields.
Responsibilities Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines for efficient data ingestion, processing, and storage, primarily focusing on creating a data factory for our core forecasting data. Data Quality and Integrity: Implement robust data quality checks and validation processes to ensure the accuracy and consistency of data used in our forecasting models. Mathematical Model Support: Collaborate with other data engineers to develop and refine the mathematical logic and models that underpin our forecasting methodologies. ML and Data Science Support: Provide data support to our Machine Learning Engineers and Data Scientists. Collaboration and Communication: Work closely with analysts, developers, and other stakeholders to understand data requirements and deliver effective solutions. Innovation and Improvement: Continuously explore and evaluate new technologies and methodologies to enhance our data infrastructure and forecasting capabilities. What We're Looking For: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. Minimum of 1 - 2 years of experience in data engineering, with a proven track record of building and maintaining data pipelines. Strong proficiency in SQL and experience with relational and non-relational databases. Strong Python programming skills, with experience in data manipulation and processing libraries (e.g., Pandas, NumPy). Experience with mathematical modelling and supporting ML and data science teams. Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data services. Strong communication and collaboration skills, with the ability to work effectively in a team environment. Experience in the automotive sector is a plus. Statement: S&P Global delivers essential intelligence that powers decision making. We provide the world's leading organizations with the right data, connected technologies and expertise they need to move ahead. 
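The data-quality responsibility described above is typically implemented as a validation gate between ingestion and load. A minimal plain-Python sketch of that pattern; the field names ("model", "units") and rules are invented for illustration, not from any real schema:

```python
# Toy data-quality gate for a pipeline stage: each record must pass all
# checks before it is loaded; rejected rows are kept with their reasons.
# Field names and rules are hypothetical examples.

def validate(record: dict) -> list[str]:
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not record.get("model"):
        errors.append("missing model")
    units = record.get("units")
    if not isinstance(units, int) or units < 0:
        errors.append("units must be a non-negative integer")
    return errors

def split_clean_dirty(records: list[dict]):
    """Partition records into loadable rows and rejected rows with reasons."""
    clean, dirty = [], []
    for r in records:
        errs = validate(r)
        if errs:
            dirty.append((r, errs))
        else:
            clean.append(r)
    return clean, dirty

rows = [{"model": "EV-1", "units": 120}, {"model": "", "units": -5}]
clean, dirty = split_clean_dirty(rows)
```

In a real pipeline the dirty partition would feed a quarantine table and alerting, so a bad upstream refresh never silently reaches the forecast models.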
As part of our team, you'll help solve complex challenges that equip businesses, governments and individuals with the knowledge to adapt to a changing economic landscape. S&P Global Mobility turns invaluable insights captured from automotive data into help for our clients to understand today's market, reach more customers, and shape the future of automotive mobility. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility. What's In It For You Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. 
We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Health & Wellness: Health care coverage designed for the mind and body. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference. For more information on benefits by country visit https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. 
---- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- 203 - Entry Professional (EEO Job Group) (inactive), 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH203 - Entry Professional (EEO Job Group)

Posted 1 month ago

Apply

4.0 - 8.0 years

17 - 20 Lacs

Bengaluru

Work from Office

Key Responsibilities: Collaborate with the Creative Director to develop and execute creative concepts and design strategies. Lead and mentor a team of designers, providing guidance, feedback, and support to elevate the quality of design output. Oversee the creative process from concept to final delivery, ensuring projects meet brand standards, deadlines, and budgets. Work closely with cross-functional teams including marketing, product, and client services to translate business goals into compelling visual narratives. Present design concepts and ideas to clients and internal stakeholders with clarity and confidence. Stay updated on industry trends, tools, and technologies to drive innovation and best practices within the team. Manage multiple projects simultaneously while maintaining high standards of creativity and quality. Foster a collaborative and inspiring work environment that encourages creativity and professional growth.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Job Summary Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment. Software Requirements Required: Java (version 8 or higher) Apache Flink, Storm, or Beam for streaming data processing Google Cloud Platform (GCP) services, especially BigQuery and related data tools Experience with databases such as BigQuery, Oracle, or equivalent Familiarity with version control tools such as Git Preferred: Cloud deployment experience with GCP in particular Additional familiarity with containerization (Docker/Kubernetes) Knowledge of CI/CD pipelines and DevOps practices Overall Responsibilities Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs. Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements. Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments. Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices. Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization. Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions. Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy. 
Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives to ensure continuous delivery and process improvement. Technical Skills (By Category) Programming Languages: Required: Java (8+) Preferred: Python, Scala, or Node.js for scripting or auxiliary processing Databases/Data Management: Experience with BigQuery, Oracle, or similar relational data stores Cloud Technologies: GCP (BigQuery, Cloud Storage, Dataflow etc.) with hands-on experience in cloud data solutions Frameworks and Libraries: Apache Flink, Storm, or Beam for stream processing Java SDKs, APIs, and data integration libraries Development Tools and Methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices Familiarity with containerization (Docker, Kubernetes) is a plus Security and Compliance: Understanding of data security principles in cloud environments Experience Requirements 4+ years of experience in software development, with a focus on data processing and Java-based backend development Proven experience working with Apache Flink, Storm, or Beam in production environments Strong background in managing large data workflows and pipeline optimization Experience with GCP data services and cloud-native development Demonstrated success in Agile projects, including collaboration with cross-functional teams Previous leadership or mentorship experience is a plus Day-to-Day Activities Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly Document technical specifications, data schemas, and process workflows Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements Support continuous integration and deployment of data applications Mentor junior team members, sharing best practices 
and technical insights Qualifications Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent Relevant certifications in cloud technologies or data processing (preferred) Evidence of continuous professional development and staying current with industry trends Professional Competencies Strong analytical and problem-solving skills focused on data processing challenges Leadership abilities to guide, mentor, and develop team members Excellent communication skills for technical documentation and stakeholder engagement Adaptability to rapidly changing technologies and project priorities Capacity to prioritize tasks and manage time efficiently under tight deadlines Innovative mindset to leverage new tools and techniques for performance improvements
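The stream-processing work this posting describes (Flink/Storm/Beam) centers on windowed aggregation over timestamped events. A framework-free Python sketch of a tumbling-window count, to illustrate the concept rather than any actual Beam or Flink API:

```python
# Tumbling-window aggregation over timestamped events, the core pattern
# behind Flink/Beam streaming jobs. Pure-Python stand-in: real engines add
# watermarks, late-data handling, state backends, and distributed execution.
from collections import defaultdict

def tumbling_window_counts(events, window_size_s: int):
    """Group (timestamp_s, key) events into fixed windows; count per key.

    Returns {window_start_s: {key: count}}, windows in ascending order.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size_s) * window_size_s
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Events at seconds 0, 3, 5 fall in window [0, 10); second 12 in [10, 20).
events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
result = tumbling_window_counts(events, window_size_s=10)
```

In Beam this corresponds to `FixedWindows` plus a per-key count; in Flink, to `TumblingEventTimeWindows` with an aggregate function. The window-assignment arithmetic is the same in all three.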

Posted 1 month ago

Apply

4.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Role: Business Analyst - Cards Issuing. Location: Bangalore. Experience: 7 to 14 years. Notice: Immediate to 60 days. The primary role of the Business Analyst is to effectively support the business and project teams in requirements gathering, design, analysis, and documentation that contributes to the development of optimal solutions. The candidate should be able to mentor a set of business analysts and review and enhance the solutions offered by them. Working with our product or client organizations to ensure we retain core integrity in the solution, and that the enhancements, applications and localizations that you design fit with our overall strategy. Providing quality deliverables across the following key work elements: Requirements elicitation; Use case definition; Process mapping; Creation of data flow diagrams, BPMN diagrams, and workflow diagrams; Functional design definition and test scope definition. Candidate Requirements: To succeed in this role, you will need: An ability to look holistically across technology, people, process and data when defining solutions; A requirement-driven (not solution-driven) attitude to Business Analysis; Great people skills: you will be working with a variety of technical, business and product personnel at junior, mid and senior levels of the organization; High quality standards: we pride ourselves on the accuracy and quality of our delivery; An interest in technology and the payments industry. We are looking for bright, highly motivated, and ambitious Business Analysts to join our team. The position involves eliciting and documenting business requirements, working directly with product teams and clients, as well as seeing each project through development, testing and implementation. The successful candidate will be expected to work in a professional manner alongside other teams as well as on their own. 
The candidates must: Have demonstrated experience in Business Analysis (product/solution definition preferred); Have a passion for business solutions; Be a self-starter and work well in a team; Be fluent in English, both written and spoken; Have the following technical skills: Proficient Microsoft Office user (Word, Excel, PowerPoint, Visio); Proficient in the use of Confluence and Jira Agile delivery tools; Have managed a team of 3+ Business Analysts.

Posted 2 months ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Chennai

Work from Office

Job Summary Synechron is seeking an experienced Data Processing Engineer to lead the development of large-scale data processing solutions using Java, Apache Flink/Storm/Beam, and Google Cloud Platform (GCP). In this role, you will collaborate across teams to design, develop, and optimize data-intensive applications that support strategic business objectives. Your expertise will help evolve our data architecture, improve processing efficiency, and ensure the delivery of reliable, scalable solutions in an Agile environment. Software Requirements Required: Java (version 8 or higher) Apache Flink, Storm, or Beam for streaming data processing Google Cloud Platform (GCP) services, especially BigQuery and related data tools Experience with databases such as BigQuery, Oracle, or equivalent Familiarity with version control tools such as Git Preferred: Cloud deployment experience with GCP in particular Additional familiarity with containerization (Docker/Kubernetes) Knowledge of CI/CD pipelines and DevOps practices Overall Responsibilities Collaborate closely with cross-functional teams to understand data and system requirements, then design scalable solutions aligned with business needs. Develop detailed technical specifications, implementation plans, and documentation for new features and enhancements. Implement, test, and deploy data processing applications using Java and Apache Flink/Storm/Beam within GCP environments. Conduct code reviews to ensure quality, security, and maintainability, supporting team members' growth and best practices. Troubleshoot technical issues, resolve bottlenecks, and optimize application performance and resource utilization. Stay current with advancements in data processing, cloud technology, and Java development to continuously improve solutions. Support testing teams to verify data workflows and validation processes, ensuring reliability and accuracy. 
Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives to ensure continuous delivery and process improvement. Technical Skills (By Category) Programming Languages: Required: Java (8+) Preferred: Python, Scala, or Node.js for scripting or auxiliary processing Databases/Data Management: Experience with BigQuery, Oracle, or similar relational data stores Cloud Technologies: GCP (BigQuery, Cloud Storage, Dataflow etc.) with hands-on experience in cloud data solutions Frameworks and Libraries: Apache Flink, Storm, or Beam for stream processing Java SDKs, APIs, and data integration libraries Development Tools and Methodologies: Git, Jenkins, JIRA, and Agile/Scrum practices Familiarity with containerization (Docker, Kubernetes) is a plus Security and Compliance: Understanding of data security principles in cloud environments Experience Requirements 4+ years of experience in software development, with a focus on data processing and Java-based backend development Proven experience working with Apache Flink, Storm, or Beam in production environments Strong background in managing large data workflows and pipeline optimization Experience with GCP data services and cloud-native development Demonstrated success in Agile projects, including collaboration with cross-functional teams Previous leadership or mentorship experience is a plus Day-to-Day Activities Design, develop, and deploy scalable data processing applications in Java using Flink/Storm/Beam on GCP Collaborate with data engineers, analysts, and architects to translate business needs into technical solutions Conduct code reviews, optimize data pipelines, and troubleshoot system issues swiftly Document technical specifications, data schemas, and process workflows Participate actively in Agile ceremonies, provide updates on task progress, and suggest process improvements Support continuous integration and deployment of data applications Mentor junior team members, sharing best practices 
and technical insights Qualifications Bachelor's or Master's degree in Computer Science, Information Technology, or equivalent Relevant certifications in cloud technologies or data processing (preferred) Evidence of continuous professional development and staying current with industry trends Professional Competencies Strong analytical and problem-solving skills focused on data processing challenges Leadership abilities to guide, mentor, and develop team members Excellent communication skills for technical documentation and stakeholder engagement Adaptability to rapidly changing technologies and project priorities Capacity to prioritize tasks and manage time efficiently under tight deadlines Innovative mindset to leverage new tools and techniques for performance improvements SYNECHRON'S DIVERSITY & INCLUSION STATEMENT Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and is an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative 'Same Difference' is committed to fostering an inclusive culture promoting equality, diversity and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.

Posted 2 months ago

Apply

9.0 - 14.0 years

25 - 27 Lacs

Bengaluru

Work from Office

Job Area: Engineering Group, Engineering Group > Software Engineering General Summary: Qualcomm XR Research India is rapidly expanding to offer state-of-the-art XR solutions. To scale and strengthen our offering in this domain, we are seeking a systems architect who will drive the next-generation technologies and architecture, shaping the future of Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) use cases. Responsibilities: Your responsibilities will span technical leadership, system architecture, software architecture, implementation, and compute analysis. Here are the key aspects of your role: Drive technical workstreams related to perception, reprojection, split-processing and other XR technologies. Identify and deeply understand the use cases for AR/VR/MR applications. Collaborate with cross-functional teams to translate use case requirements into detailed implementation specifications. Define system and software architecture, considering hardware/software tradeoffs, compute and memory constraints. Optimize compute workload distribution across different subsystems on the SoC for efficient performance and power. Validate and optimize architecture definitions through system-level use case modeling. Prototype new use cases to understand the compute and memory requirements, and influence future software/hardware features and reference device specification. Minimum Qualifications: 9+ years of experience in systems engineering with a bachelor's degree in electrical engineering, information systems, computer science, or related field. Hands-on experience in defining systems architecture and software design for multi-core architectures (CPUs, GPUs, DSPs, etc.), including performance analysis on heterogeneous architectures (core, multi-level cache, memory, etc.). Proficiency in documenting call flows and data flows for both software and hardware components. Strong communication skills and ability to work effectively in a team. 
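The workload-distribution responsibility above is often explored first with a spreadsheet-style budget model before any system-level simulation. A toy Python sketch of that idea; the subsystem names, task costs, and frame budget are invented numbers purely for illustration:

```python
# Toy SoC workload budget: assign per-frame task costs (milliseconds of
# compute) to subsystems and check utilization against the frame budget.
# All tasks and numbers are hypothetical; a real analysis uses measured
# profiles per core, cache level, and memory path.

FRAME_BUDGET_MS = 11.1  # one frame at roughly 90 Hz

# (task, subsystem, cost_ms) for an imagined perception/render pipeline
TASKS = [
    ("camera_isp", "dsp", 2.0),
    ("head_tracking", "dsp", 1.5),
    ("scene_understanding", "npu", 6.0),
    ("reprojection", "gpu", 3.0),
    ("app_render", "gpu", 7.0),
]

def utilization(tasks, frame_budget_ms):
    """Per-subsystem busy time as a fraction of the frame budget."""
    busy = {}
    for _, subsystem, cost in tasks:
        busy[subsystem] = busy.get(subsystem, 0.0) + cost
    return {s: round(t / frame_budget_ms, 2) for s, t in busy.items()}

util = utilization(TASKS, FRAME_BUDGET_MS)
overloaded = [s for s, u in util.items() if u > 1.0]
```

Moving a task between subsystems in `TASKS` and re-running immediately shows which placements blow the frame budget, which is the back-of-envelope version of the HW/SW partitioning tradeoff the role describes.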
Preferred Qualifications: 8+ years of experience in systems engineering with a master's and/or PhD degree in electrical engineering, information systems, or computer science. Proven expertise in AR/VR, computer vision, machine learning, perception, camera technology, graphics pipeline, hardware design, SoC architecture, HW/SW partitioning, and system modeling (power, performance). Proficiency in C++ and object-oriented software design. Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or related field and 4+ years of Software Engineering or related work experience. OR Master's degree in Engineering, Information Systems, Computer Science, or related field and 3+ years of Software Engineering or related work experience. OR PhD in Engineering, Information Systems, Computer Science, or related field and 2+ years of Software Engineering or related work experience. 2+ years of work experience with a Programming Language such as C, C++, Java, Python, etc. Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries). 
Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law. To all Staffing and Recruiting Agencies Please do not forward resumes to our jobs alias, Qualcomm employees or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.

Posted 2 months ago

Apply

15.0 - 20.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Project Role : Application Architect Project Role Description : Provide functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Must have skills : Manufacturing Operations Good to have skills : NA. Minimum 15 year(s) of experience is required. Educational Qualification : BE, BTECH Summary: As an Application Architect, you will provide functional and technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications. Your typical day will involve collaborating with various teams to assess impacts, produce estimates for client work requests, and ensure that the applications meet the evolving needs of the organization. You will engage in discussions that shape the direction of application development and contribute to strategic decision-making processes, ensuring that all technical aspects align with business objectives and user requirements. Roles & Responsibilities:- Expected to be a Subject Matter Expert with deep knowledge and experience.- Should have influencing and advisory skills.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Expected to provide solutions to problems that apply across multiple teams.- Facilitate workshops and discussions to gather requirements and feedback from stakeholders.- Mentor junior professionals and provide guidance on best practices in application architecture. 
Professional & Technical Skills: - Must-Have Skills: Proficiency in Manufacturing Operations.- Strong understanding of application development methodologies and frameworks.- Experience with system integration and data flow management.- Ability to analyze and optimize application performance.- Familiarity with cloud technologies and deployment strategies. Additional Information:- The candidate should have minimum 15 years of experience in Manufacturing Operations.- This position is based at our Bengaluru office.- A BE, BTECH is required. Qualification: BE, BTECH

Posted 2 months ago

Apply

15.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Project Role : Business and Integration Practitioner Project Role Description : Assists in documenting the integration strategy endpoints and data flows. Is familiar with the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations to ensure successful integration. Under the guidance of the Architect, ensures the integration strategy meets business goals. Must have skills : SAP Configure Price & Quote Good to have skills : NA. Minimum 5 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As a Business and Integration Practitioner, you will assist in documenting the integration strategy endpoints and data flows. Your typical day will involve collaborating with various teams to ensure that the integration strategy aligns with business objectives. You will engage in discussions to analyze requirements, participate in coding and testing activities, and contribute to deployment efforts. Your role will also include monitoring operations to ensure successful integration throughout the project life-cycle, all while working under the guidance of the Architect to meet the overall business goals effectively. Roles & Responsibilities:- Expected to be an SME, collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities and foster a collaborative environment.- Monitor project progress and provide regular updates to stakeholders to ensure alignment with business objectives. 
Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP Configure Price & Quote.- Strong understanding of integration strategies and data flow documentation.- Experience with project life-cycle management, including requirements analysis and deployment.- Ability to collaborate effectively with cross-functional teams to achieve project goals.- Familiarity with testing methodologies to ensure quality and performance of integrations. Additional Information:- The candidate should have minimum 5 years of experience in SAP Configure Price & Quote.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification: 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Project Role : Application Support Engineer Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems. Must have skills : SAP Callidus Configuration and Development Good to have skills : NA. Minimum 7.5 year(s) of experience is required. Educational Qualification : 15 years full time education Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and solves issues within multiple components of critical business systems. Your typical day will involve collaborating with various teams to troubleshoot and resolve software-related challenges, ensuring the seamless operation of essential applications that support business functions. You will engage in problem-solving activities, analyze system performance, and implement solutions to enhance system reliability and efficiency, all while maintaining a focus on delivering exceptional service to stakeholders. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities.- Monitor system performance and proactively address potential issues. Professional & Technical Skills: - Must-Have Skills: Proficiency in SAP Callidus Configuration and Development.- Strong understanding of software troubleshooting methodologies.- Experience with system integration and data flow analysis.- Familiarity with business process mapping and optimization.- Ability to work collaboratively in a team-oriented environment. 
Additional Information:- The candidate should have minimum 7.5 years of experience in SAP Callidus Configuration and Development.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 months ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must Have Skills: OneTrust Privacy Management
Good to Have Skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide innovative solutions to enhance data accessibility and usability.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.

Professional & Technical Skills:
- Must Have Skills: Proficiency in OneTrust Privacy Management.
- Good to Have Skills: Experience with Data Governance.
- Strong understanding of data architecture and data modeling principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in OneTrust Privacy Management.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education
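The ETL responsibilities this role describes (extract, transform, load, with a data-quality check along the way) can be sketched in a few lines. This is a hypothetical illustration using only Python's standard library, not a description of this employer's actual stack; the table and column names are made up for the example.

```python
# Minimal ETL sketch: extract rows from a CSV source, transform them
# (normalise names, drop rows failing a quality rule), and load the
# result into a SQLite target.
import csv
import io
import sqlite3

def extract(source):
    """Extract: read raw rows from a CSV-like text source."""
    return list(csv.DictReader(source))

def transform(rows):
    """Transform: normalise fields and drop rows failing a quality check."""
    out = []
    for row in rows:
        if not row["amount"].strip():  # data-quality rule: no blank amounts
            continue
        out.append({"name": row["name"].strip().title(),
                    "amount": float(row["amount"])})
    return out

def load(rows, conn):
    """Load: write transformed rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:name, :amount)", rows)
    conn.commit()

raw = io.StringIO("name,amount\n alice ,10.5\nbob,\n carol ,3\n")
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())
```

In a production pipeline each stage would typically be a separate, monitored task (for example in an orchestrator), with rejected rows routed to a quarantine table rather than silently dropped.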

Posted 2 months ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must Have Skills: SAP PP Production Planning & Control Discrete Industries
Good to Have Skills: NA
Minimum 3 years of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with cross-functional teams to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions to refine application designs and ensure alignment with business objectives, while also participating in testing and validation processes to guarantee that the applications meet the defined requirements. Your role will be pivotal in enhancing the overall user experience and ensuring that the applications are robust and efficient.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Facilitate workshops to gather requirements and feedback from stakeholders.
- Develop and maintain comprehensive documentation for application designs and processes.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP PP Production Planning & Control Process Industries.
- Strong understanding of business process modeling and application design principles.
- Experience with integration of SAP modules and data flow management.
- Ability to analyze and optimize production planning processes.
- Familiarity with project management methodologies and tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience in SAP PP Production Planning & Control Process Industries.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

12.0 - 15.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Project Role: Business and Integration Architect
Project Role Description: Designs the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Understands the entire project life-cycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration.
Must Have Skills: Murex Connectivity 2.0
Good to Have Skills: Murex Back Office Workflows
Minimum 12 years of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Business and Integration Architect, you will be responsible for designing the integration strategy, endpoints, and data flow to align technology with business strategy and goals. A typical day involves collaborating with various teams to understand project requirements, analyzing data flows, and ensuring that the integration processes are efficient and effective. You will engage in discussions to refine strategies and provide insights that drive project success, while also overseeing the implementation of integration solutions throughout the project life-cycle, from requirements analysis to deployment and operations.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and meetings to gather requirements and align team objectives.
- Mentor junior professionals to enhance their skills and understanding of integration strategies.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Murex Connectivity 2.0.
- Good to Have Skills: Experience with Murex Back Office Workflows.
- Strong understanding of integration architecture and design principles.
- Experience with data modeling and data flow analysis.
- Proficiency in project management methodologies and tools.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Murex Connectivity 2.0.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years full time education

Posted 2 months ago

Apply

8.0 - 13.0 years

20 - 30 Lacs

Gurugram

Work from Office

Hi, greetings from GSN! It is a pleasure connecting with you. We have been in corporate search services, identifying and bringing in talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting our clients' needs for the last 20 years. At present, GSN is hiring a GCP Engineer for one of our leading MNC clients. Please find the details below:
1. Work Location: Gurugram
2. Job Role: GCP Engineer
3. Experience: 8+ years
4. CTC Range: Rs. 20 LPA to Rs. 30 LPA
5. Work Type: WFO (Hybrid)

****** Looking for IMMEDIATE JOINER ******

Who are we looking for? An MLOps Engineer with AWS experience.

Required Skills:
- GCP Arch. Certification
- Terraform
- GitLab
- Shell Scripting
- GCP Services: Compute Engine, Cloud Storage, Dataflow, BigQuery, IAM

****** Looking for IMMEDIATE JOINER ******

Best regards,
Kaviya | GSN | Kaviya@gsnhr.net | 9150016092 | Google review: https://g.co/kgs/UAsF9W

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
