
233 Data Flow Jobs - Page 5

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Req ID: 323394. We are currently seeking a Sales and CPQ Developer to join our team in Pune, Maharashtra (IN-MH), India (IN).

Basic Qualifications:
- 4+ years of Salesforce consulting experience, including 4 years in the CPQ and Sales Cloud space (i.e., 2+ years in Salesforce CPQ implementation, plus 3+ years' experience on other CPQ platforms)
- 2+ years of Salesforce CPQ implementations
- Proven experience implementing CPQ solutions, including enterprise architecture, leading a team through ERP integration, and understanding of downstream processes such as billing and provisioning
- Salesforce CPQ and Sales Cloud Specialist certification

Preferred Qualifications:
- 4+ years of end-to-end CPQ and Sales Cloud project implementations
- Experience with Salesforce administration, configuration, and tools like Process Builder
- Experience with RESTful service architecture, CPQ APIs, and QCP (Quote Calculator Plugin) is a plus
- Experience with CPQ deployment tools such as Prodly
- Strong CPQ configuration experience and system administration
- Troubleshoot and resolve CPQ-related issues, including pricing, discounts, and product configuration logic
- Strong functional knowledge of OOTB capabilities
- Strong knowledge of designing architecture diagrams and data flows
- Strong problem-solving skills
- Knowledge of the end-to-end Order Management lifecycle
- Knowledge of Agile methodologies and understanding of the software development process
- Knowledge of Lightning Web Components and how to address UI/UX requirements effectively when deploying CPQ to channels/distributors/partners
- Well-versed in the Salesforce security model; Communities experience is a plus
- In-depth understanding of CPQ architecture (data and logic layers), data models, customizations, and extensions
- Excellent verbal and written communication skills, with the ability to tailor messaging to the audience
- Capable of recommending best-practice solutions based on project and business needs and owning the overall design of the technical application
- Hands-on experience with Salesforce Data Loader
- Sales Cloud, Salesforce Administrator, and App Builder certifications
- Technical skills in Apex and other languages are appreciated but not required

Skills: Salesforce CPQ, Apex, Visualforce, SOQL, REST/SOAP APIs, JavaScript, Salesforce Lightning, Data Migration

Posted 2 weeks ago

Apply

2.0 - 4.0 years

8 - 12 Lacs

Mumbai

Work from Office

Naukri logo

The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
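The core of such a migration is translating SAS DATA step row logic (WHERE subsetting plus derived columns) into DataFrame operations. A minimal sketch, with hypothetical column names and plain Python standing in for the PySpark calls it maps to:

```python
# Hypothetical sketch: the row-level logic of a SAS DATA step
# (subset + derived column) written in plain Python. In PySpark the
# same logic becomes df.filter(...).withColumn(...) on a DataFrame.
def migrate_rows(rows):
    """Mimic: data out; set in; where amount > 0; margin = amount - cost; run;"""
    out = []
    for row in rows:
        if row["amount"] > 0:  # SAS: where amount > 0;
            new = dict(row)
            new["margin"] = row["amount"] - row["cost"]  # SAS: margin = amount - cost;
            out.append(new)
    return out

sample = [
    {"amount": 100, "cost": 60},
    {"amount": -5, "cost": 1},   # filtered out, as the WHERE clause would
]
print(migrate_rows(sample))  # -> [{'amount': 100, 'cost': 60, 'margin': 40}]
```

The PySpark equivalent would express the same two steps declaratively, letting Spark distribute them across the cluster.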

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

Naukri logo

Role: GCP Data Engineer
Experience: 4+ years (Data Engineering background preferred)
Location: Bangalore, Chennai, Pune, Gurgaon, Kolkata
Required Skills: GCP DE experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python-based ingestion, Dataflow + Pub/Sub

Job Requirement:
- Have implemented and architected solutions on Google Cloud Platform using GCP components
- Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases
- Google Professional Data Engineer/Solution Architect certification is a major advantage

Skills Required:
- 3-13 years of experience in IT or professional services, in IT delivery or large-scale IT analytics projects
- 3+ years of expert knowledge of Google Cloud Platform; other cloud platforms are nice to have
- Expert knowledge in SQL development
- Expertise in building data integration and preparation tools using cloud technologies (such as SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.)
- Identify downstream implications of data loads/migrations (e.g., data quality, regulatory, etc.)
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations
- Capability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) and using databases in a business environment with complex datasets
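The "end-to-end data pipeline" shape this listing asks for is the same ingest, transform, aggregate chain that Apache Beam/Dataflow formalizes as PTransforms. A hedged stdlib sketch of that shape (field names are hypothetical):

```python
# Minimal ingest -> transform -> aggregate pipeline, using plain
# generators to illustrate the structure a Beam/Dataflow job formalizes.
import csv
import io
from collections import defaultdict

RAW = "city,sales\nPune,10\nChennai,5\nPune,7\n"

def ingest(text):
    # analogous to a Beam source / ReadFromText + parse step
    yield from csv.DictReader(io.StringIO(text))

def transform(rows):
    # analogous to a Beam Map/ParDo: cast and reshape each record
    for r in rows:
        yield {"city": r["city"], "sales": int(r["sales"])}

def aggregate(rows):
    # analogous to Beam's CombinePerKey(sum)
    totals = defaultdict(int)
    for r in rows:
        totals[r["city"]] += r["sales"]
    return dict(totals)

print(aggregate(transform(ingest(RAW))))  # -> {'Pune': 17, 'Chennai': 5}
```

In a real Dataflow job each stage becomes a distributed PTransform; the staging and shuffling are what the managed runner adds on top of this logical shape.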

Posted 2 weeks ago

Apply

3.0 - 7.0 years

3 - 7 Lacs

Mumbai, Maharashtra, India

On-site

Foundit logo

Job description: A Data Platform Engineer specialises in the design, build, and maintenance of cloud-based data infrastructure and platforms for data-intensive applications and services. They develop Infrastructure as Code and manage the foundational systems and tools for efficient data storage, processing, and management. This role involves architecting robust and scalable cloud data infrastructure, including selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, a Data Platform Engineer ensures the continuous evolution of the data platform to meet changing data needs and leverage technological advancements, while maintaining high levels of data security, availability, and performance. They are also tasked with creating and managing processes and tools that enhance operational efficiency, including optimising data flow and ensuring seamless data integration, all of which are essential for enabling developers to build, deploy, and operate data-centric applications efficiently.

Job Description - Grade Specific: A senior leadership role that entails the oversight of multiple teams or a substantial team of data platform engineers, the management of intricate data infrastructure projects, and strategic decisions that shape technological direction within data platform engineering. Key responsibilities encompass:
- Strategic Leadership: Leading multiple data platform engineering teams, steering substantial projects, and setting the strategic course for data platform development and operations.
- Complex Project Management: Supervising the execution of intricate data infrastructure projects, ensuring alignment with client objectives and the delivery of value.
- Technical and Strategic Decision-Making: Making well-informed decisions concerning data platform architecture, tools, and processes, balancing technical considerations with broader business goals.
- Influencing Technical Direction: Utilising profound technical expertise in data platform engineering to influence the direction of the team and the client, driving enhancements in data platform technologies and processes.
- Innovation and Contribution to the Discipline: Serving as innovators and influencers within the field of data platform engineering, contributing to the advancement of the discipline through thought leadership and knowledge sharing.
- Leadership and Mentorship: Offering mentorship and guidance to both managers and technical personnel, cultivating a culture of excellence and innovation within the domain of data platform engineering.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Bengaluru

Remote

Naukri logo

Job Requirement for Offshore Data Engineer (with ML expertise)
Work Mode: Remote | Base Location: Bengaluru | Experience: 5+ years

Technical Skills & Expertise:
- PySpark & Apache Spark: Extensive experience with PySpark and Spark for big data processing and transformation. Strong understanding of Spark architecture, optimization techniques, and performance tuning. Ability to work with Spark jobs in distributed computing environments like Databricks.
- Data Mining & Transformation: Hands-on experience designing and implementing data mining workflows. Expertise in data transformation processes, including ETL (Extract, Transform, Load) pipelines. Experience in large-scale data ingestion, aggregation, and cleaning.
- Programming Languages: Proficient in Python for data engineering tasks, including libraries like Pandas and NumPy. Scala proficiency is preferred for Spark job development.
- Big Data Concepts: In-depth knowledge of big data frameworks and paradigms, such as distributed file systems, parallel computing, and data partitioning.
- Big Data Technologies: Experience with NoSQL databases like Cassandra and distributed storage systems like Hadoop. Proficiency with Hive for data warehousing solutions and querying. Experience with the Beam architecture and other ETL tools for large-scale data workflows.
- Cloud Technologies (GCP): Expertise in Google Cloud Platform, including core services like Cloud Storage, BigQuery, and Dataflow. Experience with Dataflow jobs for batch and stream processing. Familiarity with managing workflows using Airflow for task scheduling and orchestration in GCP.
- Machine Learning & AI: Familiarity with Generative AI and its applications in ML pipelines. Knowledge of basic ML model building using tools like Pandas and NumPy, with visualization in Matplotlib. Experience managing end-to-end MLOps pipelines for deploying models to production, particularly LLM (Large Language Model) deployments. Understanding and experience in building pipelines using Retrieval-Augmented Generation (RAG) architecture to enhance model performance and output.

Tech stack: Spark, PySpark, Python, Scala, GCP Dataflow, Cloud Composer (Airflow), ETL, Databricks, Hadoop, Hive, GenAI, basic ML modeling, MLOps, LLM deployment, RAG
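The RAG architecture mentioned above hinges on a retrieval step: rank a corpus against the query and keep the top-k passages to place in the LLM prompt. A toy illustration (production systems use vector embeddings rather than token overlap; the corpus here is invented):

```python
# Toy retrieval step of a RAG pipeline: score passages by token overlap
# with the query and keep the top-k to stuff into the prompt context.
def retrieve(query, corpus, k=2):
    q_tokens = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_tokens & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

docs = [
    "Dataflow runs Apache Beam pipelines on GCP",
    "Cassandra is a NoSQL database",
    "Airflow schedules batch pipelines on GCP",
]
# The unrelated Cassandra passage is excluded from the top-2.
print(retrieve("pipelines on GCP", docs, k=2))
```

Swapping the overlap score for cosine similarity over embeddings, and prepending the retrieved passages to the LLM prompt, yields the basic RAG loop the listing describes.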

Posted 2 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

Capgemini Invent: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your Role: Has data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP. Experience with cloud storage, cloud databases, cloud data warehousing and data lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, S3. Has good knowledge of cloud compute services and load balancing. Has good knowledge of cloud identity management, authentication and authorization. Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, Azure Functions. Experience in using cloud data integration services for structured, semi-structured and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, Dataproc.

Your Profile: Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment vs performance and scaling. Able to contribute to making architectural choices using various cloud services and solution methodologies. Expertise in programming using Python. Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud. Must understand networking, security, design principles and best practices in cloud.

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

14 - 19 Lacs

Hyderabad

Work from Office

Naukri logo

Project Role: Business and Integration Architect
Project Role Description: Designs the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Understands the entire project lifecycle, including requirements analysis, coding, testing, deployment, and operations, to ensure successful integration.
Must-have skills: SAP EWM
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Business and Integration Architect, you will be responsible for designing the integration strategy, endpoints, and data flow to align technology with business strategy and goals. Your typical day will involve collaborating with various teams to understand their needs, analyzing project requirements, and ensuring that the integration processes are seamless and efficient. You will engage in discussions to refine strategies, oversee the implementation of solutions, and monitor the overall project lifecycle to ensure that all aspects, from coding to deployment, are executed effectively. Your role will also require you to stay updated with the latest technologies and methodologies to enhance integration practices and drive business success.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business objectives.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP EWM.
- Strong understanding of integration strategies and data flow management.
- Experience with project lifecycle management, including requirements analysis and deployment.
- Ability to analyze complex business requirements and translate them into technical solutions.
- Familiarity with various integration tools and technologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP EWM.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Naukri logo

Develop SAP BW-IP data flows in the S/4HANA system. Provide inputs on data modelling between BW 7.4 on HANA and native HANA using CompositeProviders, ADSOs, and Open ODS views. Excellent verbal and written communication skills in English are required. Self-motivated, capable of managing their own workload with minimal supervision. Experience creating complex, enterprise-transforming applications within a dynamic, progressive, technically diverse environment. Location: Pan India

Posted 2 weeks ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Naukri logo

Involved in all aspects of a project, with the dual ability to maintain the broad vision required for design development, including strategic thinking and leadership, and to oversee the intricate details of a project from inception through launch. Has a strong understanding of the client's business, industry, economic model, organizational trends, and customer needs in order to lead with relevant digital marketing solutions. Partners across teams to recommend and determine appropriate project execution models (waterfall and/or agile practices) and project engagement type (Retainer, Fixed Fee, Time and Materials) as part of solutioning. Establishes and maintains a center of excellence for every project with the client and/or subject matter experts (SMEs). Serves as a central point of contact for project estimates, utilizing department leads and SMEs to determine estimates for their teams' activities. Facilitates the creation of accurate project plans with clearly defined milestones, deliverables, and tasks. Works with department SMEs to determine department-level deliverables and create resource allocation/staffing plans for the lifecycle of the project. Experience with IBM Design Thinking, Agile, DevOps, Scrum, SAFe, LeSS, and SDLC. Designs major aspects of the architecture of an application, including components, UI, middleware, and databases. Provides technical leadership to the application development team. Proficient in performing design and code reviews. Ensures application design standards are maintained. Creates and maintains documentation surrounding software architecture, application design processes, component integrations, testing guidelines, etc. Responsible for training developers and refining the team's technical expertise. Problem-solving skills to effectively identify and develop architectural systems that meet the needs of clients.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- For a given scope area, evaluate architectural models and perform/drive in-depth analysis of systems, data flows, processes, and KPIs/metrics about the current state
- Develop an understanding of business processes, data flows, strategy, and long-term thinking to come up with an end-state architecture for large and complex systems
- Full-stack software architecture expertise: designing and developing full-stack modules and components for web applications (frontend and backend services)
- Working experience on MEAN (Mongo, Express, Angular, Node) and MERN (Mongo, Express, React, Node) stacks
- Consumer web development experience for high-traffic, public-facing web applications

Preferred technical and professional experience:
- Experience working with A/B test frameworks such as Optimizely
- Experience using front-end monitoring tools to troubleshoot errors and recognize performance bottlenecks
- Experience designing UX that simplifies the user experience, and dashboards for viewing high volumes of information
- Expertise in hosting and configuring data annotation tools, defining metadata for media types such as images, audio, and video, and model-based data capture
- Experience playing a liaison role with ML engineers, data scientists, and data analysts to translate business requirements into conceptual designs

Posted 2 weeks ago

Apply

5.0 - 7.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to interface directly with customers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform
- Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work

Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed
- Up-to-date technical knowledge through attending educational workshops and reviewing publications

Posted 2 weeks ago

Apply

1.0 - 3.0 years

3 - 6 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Locations: Pune/Bangalore/Hyderabad/Indore
Contract duration: 6 months

Responsibilities:
- Be responsible for the development of conceptual, logical, and physical data models and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms.
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational and dimensional required; NoSQL optional) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
- Must have a payments background.

Skills:
- Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols).
- Experience with data warehouses, data lakes, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER/Studio, or others) required.
- Experience in team management, communication, and presentation.
- Experience with Erwin, Visio, or any other relevant tool.
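The dimensional modeling this role centers on reduces to fact tables keyed against dimension tables. A minimal sketch in SQLite (table and column names are illustrative, not from the listing):

```python
# Minimal star-schema sketch: one fact table and one dimension table,
# joined and aggregated for a report. Names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, amount REAL);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales  VALUES (1, 10.0), (1, 5.0), (2, 2.5);
""")

# The classic dimensional query: join the fact to the dimension,
# group by a dimension attribute, aggregate the measure.
report = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(report)  # -> [('gadget', 2.5), ('widget', 15.0)]
```

The same fact/dimension separation scales from this toy to the enterprise warehouses and data marts the listing describes; tools like Erwin document exactly these keys and relationships.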

Posted 2 weeks ago

Apply

9.0 - 10.0 years

12 - 14 Lacs

Hyderabad

Work from Office

Naukri logo

Responsibilities:
- Design, develop, and maintain data pipelines using Airflow/Data Flow/Data Lake
- Optimize performance and scalability of ETL processes with SQL and Python

Posted 2 weeks ago

Apply

2.0 - 7.0 years

1 - 6 Lacs

Hyderabad, Qatar

Work from Office

Naukri logo

SUMMARY
Job Summary: Exciting job opportunity as a Registered Nurse in Qatar (Homecare)

Key Responsibilities:
- Develop and assess nursing care plans
- Monitor vital signs and assess holistic patient needs
- Collaborate with physicians, staff nurses, and healthcare team members
- Administer oral and subcutaneous medications while ensuring safety
- Document nursing care, medications, and procedures using the company's Nurses Buddy application
- Conduct client assessments and reassessments using approved tools
- Attend refresher training courses, seminars, and training

Timeline for Migration:
- Application to selection: not more than 5 days
- DataFlow & Prometric: 1 month
- Visa processing: 1-2 months
- Start working in Qatar within 3 months!

Requirements:
- Educational Qualification: Bachelor's Degree in Nursing or GNM
- Experience: minimum 2 years of working experience as a Nurse post-registration
- Citizenship: Indian
- Age limit: below 45 years
- Certification: registration certificate from a Nursing Council
- Language: basic English proficiency required
- Technical Skills: bedside nursing, patient care, patient assessment and monitoring

Benefits:
- High Salary & Perks: earn 5,000 QAR/month (1,18,000 INR/month)
- Tax Benefit: no tax deduction on salary
- Career Growth: advanced nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialization
- Relocation Support: visa process and flight sponsored; free accommodation and transportation provided
- International Work Experience: boost your resume with international healthcare expertise
- Comprehensive Health Insurance: medical coverage under Qatar's healthcare system
- Safe and Stable Environment: Qatar is known for its low crime rate, political stability, and high quality of life; the country's strict laws make it one of the safest places to live
- Faster Visa Processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times
- Simplified Licensing Process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through QCHP (Qatar Council for Healthcare Practitioners)
- Direct Hiring Opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications

Limited slots available! Apply now to secure your place in the next batch of nurses migrating to Qatar!

Posted 2 weeks ago

Apply

2.0 - 7.0 years

3 - 7 Lacs

Noida

Work from Office

Naukri logo

Who we are: R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve patients' experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst the Best in Healthcare, Top 50 Best Workplaces™ for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion, and Top 10 for Health and Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India, with a presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated, with a robust set of employee benefits and engagement activities.

About the role: You will need to work closely and communicate effectively with internal and external stakeholders in an ever-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling it on proprietary tools. Be able to take up new initiatives independently and collaborate with external and internal stakeholders. Be a strong team player. Be able to create and define SOPs and TATs for ongoing and upcoming projects.

What you will need:
- Graduate in any discipline (preferably via regular attendance) from a recognized educational institute with a good academic track record
- At least 2 years of live hands-on experience with advanced analytical tools (Power BI, Tableau, SQL)
- Solid understanding of SSIS (ETL) with strong SQL and PL/SQL: connecting to data sources, importing data, and transforming data for Business Intelligence
- Expertise in DAX and visuals in Power BI, with live hands-on experience on end-to-end projects
- Strong mathematical skills to help collect, measure, organize, and analyze data
- Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports
- Identify, analyze, and interpret trends or patterns in complex data sets
- Ability to communicate with technical and business resources at many levels in a manner that supports progress and success
- Ability to understand, appreciate, and adapt to new business cultures and ways of working
- Demonstrates initiative and works independently with minimal supervision

Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world. We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or visit us on Facebook.

Posted 2 weeks ago

Apply

1.0 - 4.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities:
- Build data integrations and data models to support the analytical needs of this project
- Translate business requirements into technical requirements as needed
- Design and develop automated scripts for data pipelines to process and transform data as per the requirements, and monitor them
- Produce artifacts such as data flow diagrams, designs, and data models, along with Git code, as deliverables
- Use tools and programming languages such as SQL, Python, Snowflake, Airflow, dbt, and Salesforce Data Cloud
- Ensure data accuracy, timeliness, and reliability throughout the pipeline
- Complete QA and data profiling to ensure data is ready for UAT as per the requirements
- Collaborate with business stakeholders and the visualization team, and support enhancements
- Provide timely updates on sprint boards and tasks
- As team lead, provide timely project updates on all projects
- Project experience with version control systems and CI/CD such as Git, GitFlow, Bitbucket, Jenkins, etc.
- Participate in UAT to resolve findings and plan Go-Live/production deployment

Milestones:
- Data integration plan into Data Cloud for structured and unstructured data/RAG needs for the Sales AI use cases
- Design data models and the semantic layer on Salesforce AI
- Agentforce prompt integration
- Data quality and sourcing enhancements
- Write Agentforce prompts and refine as needed
- Assist decision scientists with data needs
- Collaborate with the EA team and participate in design reviews
- Performance tuning and optimization of data pipelines
- Hypercare after deployment
- Project review and knowledge transfer
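The QA and data-profiling step called out above usually means counting nulls and duplicate keys before data is handed to UAT. A hedged sketch (column names are hypothetical, not from the listing):

```python
# Minimal pre-UAT data-quality profile: count null fields and duplicate
# key values in a batch of records. Column names are illustrative.
def profile(rows, key):
    nulls = sum(1 for r in rows for v in r.values() if v is None)
    seen, dupes = set(), 0
    for r in rows:
        k = r[key]
        dupes += k in seen  # True counts as 1
        seen.add(k)
    return {"rows": len(rows), "nulls": nulls, "duplicate_keys": dupes}

data = [
    {"id": 1, "email": "a@x.com"},
    {"id": 1, "email": None},      # duplicate id and a null field
    {"id": 2, "email": "b@x.com"},
]
print(profile(data, "id"))  # -> {'rows': 3, 'nulls': 1, 'duplicate_keys': 1}
```

In practice these counts would be emitted by an Airflow task or a dbt test against the warehouse; the logic is the same whether it runs in Python or as SQL `COUNT` queries.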

Posted 2 weeks ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Gurugram

Work from Office

Naukri logo

Who we are: R1 is a leading provider of technology-driven solutions that help hospitals and health systems manage their financial systems and improve patient experience. We are the one company that combines the deep expertise of a global workforce of revenue cycle professionals with the industry's most advanced technology platform, encompassing sophisticated analytics, AI, intelligent automation, and workflow orchestration. R1 is a place where we think boldly to create opportunities for everyone to innovate and grow. A place where we partner with purpose through transparency and inclusion. We are a global community of engineers, front-line associates, healthcare operators, and RCM experts that work together to go beyond for all those we serve. Because we know that all this adds up to something more, a place where we're all together better. R1 India is proud to be recognized amongst the Top 25 Best Companies to Work For 2024 by the Great Place to Work Institute. This is our second consecutive recognition on this prestigious Best Workplaces list, building on the Top 50 recognition we achieved in 2023. Our focus on employee wellbeing, inclusion, and diversity is demonstrated through prestigious recognitions, with R1 India being ranked amongst Best in Healthcare, Top 50 Best Workplaces for Millennials, Top 50 for Women, Top 25 for Diversity and Inclusion, and Top 10 for Health and Wellness. We are committed to transforming the healthcare industry with our innovative revenue cycle management services. Our goal is to make healthcare work better for all by enabling efficiency for healthcare systems, hospitals, and physician practices. With over 30,000 employees globally, we are about 17,000+ strong in India with presence in Delhi NCR, Hyderabad, Bangalore, and Chennai. Our inclusive culture ensures that every employee feels valued, respected, and appreciated with a robust set of employee benefits and engagement activities.
About the role You will work closely and communicate effectively with internal and external stakeholders in a fast-changing, rapid-growth environment with tight deadlines. This role involves analyzing healthcare data and modeling it on proprietary tools. You should be able to take up new initiatives independently, collaborate with external and internal stakeholders, and be a strong team player, able to create and define SOPs and TATs for ongoing and upcoming projects. What will you need: Graduate in any discipline (preferably through regular attendance) from a recognized educational institute with a good academic track record. At least 2 years of live hands-on experience with advanced analytical tools (Power BI, Tableau, SQL). Solid understanding of SSIS (ETL) with strong SQL and PL/SQL: connecting to data sources, importing data, and transforming data for business intelligence. Expertise in DAX and visuals in Power BI, with live hands-on experience on end-to-end projects. Strong mathematical skills to help collect, measure, organize, and analyze data. Interpret data, analyze results using advanced analytical tools and techniques, and provide ongoing reports. Identify, analyze, and interpret trends or patterns in complex data sets. Ability to communicate with technical and business resources at many levels in a manner that supports progress and success. Ability to understand, appreciate, and adapt to new business cultures and ways of working. Demonstrates initiative and works independently with minimal supervision. Working in an evolving healthcare setting, we use our shared expertise to deliver innovative solutions. Our fast-growing team has opportunities to learn and grow through rewarding interactions, collaboration, and the freedom to explore professional interests. Our associates are given valuable opportunities to contribute, to innovate, and to create meaningful work that makes an impact in the communities we serve around the world.
We also offer a culture of excellence that drives customer success and improves patient care. We believe in giving back to the community and offer a competitive benefits package. To learn more, visit r1rcm.com or visit us on Facebook.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

20 - 25 Lacs

Pune

Work from Office

Naukri logo

Primary Responsibilities Provide engineering leadership, mentorship, and technical direction to a small team of other engineers (~6 members). Partner with your Engineering Manager to ensure engineering tasks are understood, broken down, and implemented to the highest quality standards. Collaborate with members of the team to solve challenging engineering tasks on time and with high quality. Engage in code reviews and training of team members. Support continuous deployment pipeline code. Situationally troubleshoot production issues alongside the support team. Continually research and recommend product improvements. Create and integrate features for our enterprise software solution using the latest Python technologies. Assist with and adhere to enforcement of project deadlines and schedules. Evaluate, recommend, and propose solutions to existing systems. Actively communicate with team members to clarify requirements and overcome obstacles to meet the team goals. Leverage open-source and other technologies and languages outside of the Python platform. Develop cutting-edge solutions to maximize the performance, scalability, and distributed processing capabilities of the system. Provide troubleshooting and root cause analysis for production issues that are escalated to the engineering team. Work with development teams in an agile context as it relates to software development, including Kanban, automated unit testing, test fixtures, and pair programming. Requirements: 4-8 or more years of experience as a Python developer on enterprise projects using Python, Flask, FastAPI, Django, PyTest, Celery, and other Python frameworks. Software development experience including object-oriented programming, concurrency programming, modern design patterns, RESTful service implementation, micro-service architecture, test-driven development, and acceptance testing.
Familiarity with tools used to automate the deployment of an enterprise software solution to the cloud: Terraform, GitHub Actions, Concourse, Ansible, etc. Proficiency with Git as a version control system. Experience with Docker and Kubernetes. Experience with relational SQL and NoSQL databases, including MongoDB and MSSQL. Experience with object-oriented languages: Python, Java, Scala, C#, etc. Experience with testing tools such as PyTest, WireMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as BigQuery, GKE, GCS, Dataflow, Kubeflow, and/or Vertex AI. Excellent problem-solving and communication skills. Experience with Java and Spring a big plus. Disability Accommodation: UKGCareers@ukg.com.
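The concurrency programming called out above can be illustrated with a small stdlib sketch. This is not from the posting: `fetch_status` is a hypothetical stand-in for real I/O (a database or HTTP call), and the fan-out pattern shown is the same shape that a Celery task queue or FastAPI background worker applies at larger scale.

```python
# Fan a batch of independent I/O-bound tasks out to a thread pool and
# collect the results in input order.

from concurrent.futures import ThreadPoolExecutor

def fetch_status(record_id):
    # Placeholder for a network or database call.
    return {"id": record_id, "status": "ok"}

def process_batch(record_ids, max_workers=4):
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves the order of the inputs even though the
        # tasks may complete out of order.
        return list(pool.map(fetch_status, record_ids))

results = process_batch([1, 2, 3])
```

For CPU-bound work the same code would switch to `ProcessPoolExecutor`; the calling pattern is unchanged.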

Posted 2 weeks ago

Apply

5.0 - 10.0 years

35 - 40 Lacs

Pune

Work from Office

Naukri logo

: Job Title: Data Engineer (ETL, Big Data, Hadoop, Spark, GCP), AVP Location: Pune, India Role Description The Engineer is responsible for developing and delivering elements of engineering solutions to accomplish business goals. Awareness of the bank's important engineering principles is expected. Root cause analysis skills are developed through addressing enhancements and fixes to products; build reliability and resiliency into solutions through early testing, peer reviews, and automating the delivery lifecycle. The successful candidate should be able to work independently on medium to large projects with strict deadlines, in a cross-application, mixed technical environment, and must demonstrate a solid hands-on development track record while working in an agile methodology. The role demands working alongside a geographically dispersed team. The position is required as part of the buildout of the Compliance tech internal development team in India. The overall team will primarily deliver improvements in compliance tech capabilities that are major components of the regular regulatory portfolio, addressing various regulatory commitments and monitor mandates. What we'll offer you As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy Gender-neutral parental leave 100% reimbursement under the childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for those 35 and above Your key responsibilities Analyze data sets; design and code stable and scalable data ingestion workflows, integrating them into existing workflows. Work with team members and stakeholders to clarify requirements and provide the appropriate ETL solution.
Work as a senior developer developing analytics algorithms on top of ingested data. Work as a senior developer for various data sourcing in Hadoop and GCP. Ensure new code is tested at both the unit and system level; design, develop, and peer-review new code and functionality. Operate as a member of an agile scrum team. Apply root cause analysis skills to identify bugs and issues behind failures. Support the production support and release management teams in their tasks. Your skills and experience: More than 6 years of coding experience in reputed organizations. Hands-on experience with Bitbucket and CI/CD pipelines. Proficient in Hadoop, Python, Spark, SQL, Unix, and Hive. Basic understanding of on-prem and GCP data security. Hands-on development experience on large ETL/big data systems, GCP being a big plus. Hands-on experience with Cloud Build, Artifact Registry, Cloud DNS, Cloud Load Balancing, etc. Hands-on experience with Dataflow, Cloud Composer, Cloud Storage, Dataproc, etc. Basic understanding of data quality dimensions such as consistency, completeness, accuracy, and lineage. Hands-on business and systems knowledge gained in a regulatory delivery environment. Banking experience, with regulatory and cross-product knowledge. Passionate about test-driven development. Prior experience with release management tasks and responsibilities. Data visualization experience in Tableau is good to have. How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
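The ingest-then-aggregate shape of the workflows described above can be sketched with stdlib Python. This is an illustration only: in the role's actual stack the same logic would be a Spark or Dataflow job over Hive/BigQuery tables, and the record fields (`trade_id`, `desk`, `notional`) are invented for the example.

```python
# Parse raw delimited records, drop malformed rows, and aggregate by key --
# the same map/filter/reduce shape a Spark job applies at scale.

from collections import defaultdict

def ingest(raw_lines):
    """Parse 'trade_id,desk,notional' lines, skipping malformed rows."""
    for line in raw_lines:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue  # bad record; in production this would land in a reject table
        trade_id, desk, notional = parts
        try:
            yield {"trade_id": trade_id, "desk": desk, "notional": float(notional)}
        except ValueError:
            continue  # non-numeric notional

def notional_by_desk(records):
    totals = defaultdict(float)
    for rec in records:
        totals[rec["desk"]] += rec["notional"]
    return dict(totals)

raw = ["T1,rates,100.0", "T2,fx,250.5", "bad row", "T3,rates,50.0"]
summary = notional_by_desk(ingest(raw))
```

Routing rejects to a side output rather than silently dropping them is what makes the production version auditable; the control flow is otherwise the same.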

Posted 2 weeks ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office

Naukri logo

: Job Title: GCP Data Engineer, AS Location: Pune, India Corporate Title: Associate Role Description An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers. What we'll offer you As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy Gender-neutral parental leave 100% reimbursement under the childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for those 35 and above Your key responsibilities Design, develop, and maintain data pipelines using Python and SQL on GCP. Experience in Agile methodologies, ETL, ELT, data movement, and data processing. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
Ensure team collaboration using Jira, Confluence, and other tools. Ability to quickly learn new and existing technologies. Strong problem-solving skills. Write advanced SQL and Python scripts. Certification as a Google Cloud Professional Data Engineer will be an added advantage. Your skills and experience 6+ years of IT experience as a hands-on technologist. Proficient in Python for data engineering. Proficient in SQL. Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have. Hands-on experience hosting and consuming REST APIs. Proficient in Terraform (HashiCorp). Experienced with GitHub and GitHub Actions. Experienced in CI/CD. Experience in automating ETL testing using Python and SQL. API knowledge is good to have. Bitbucket is good to have. How we'll support you Training and development to help you excel in your career A culture of continuous learning to aid progression Coaching and support from experts in your team A range of flexible benefits that you can tailor to suit your needs
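"Automating ETL testing using Python and SQL" can be shown concretely with an in-memory database. This is a hedged sketch, not the team's actual test suite: SQLite stands in for BigQuery, and the table, columns, and transformation query are hypothetical.

```python
# Automated ETL test: seed a tiny staging table, run the transformation
# SQL under test, and assert on the result set.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [("O1", 10.0, "SHIPPED"), ("O2", 5.0, "CANCELLED"), ("O3", 7.5, "SHIPPED")],
)

# The transformation under test: count and total of shipped orders only.
shipped_count, shipped_total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM staging_orders WHERE status = 'SHIPPED'"
).fetchone()
```

Wrapped in pytest fixtures and pointed at a test dataset in the warehouse, the same seed/run/assert pattern turns every transformation into a regression test in the CI/CD pipeline.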

Posted 2 weeks ago

Apply

3.0 - 5.0 years

32 - 40 Lacs

Pune

Work from Office

Naukri logo

: Job Title: Senior Engineer, VP Location: Pune, India Role Description The Engineer is responsible for managing or performing work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes: Planning and developing entire engineering solutions to accomplish business goals Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle Ensuring maintainability and reusability of engineering solutions Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow Reviewing engineering plans and quality to drive re-use and improve engineering capability Participating in industry forums to drive adoption of innovative technologies, tools, and solutions in the Bank. Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support. What we'll offer you As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best-in-class leave policy Gender-neutral parental leave 100% reimbursement under the childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive hospitalization insurance for you and your dependents Accident and term life insurance Complimentary health screening for those 35 and above Your key responsibilities The candidate is expected to: Be a hands-on engineering lead involved in analysis, design, design/code reviews, coding, and release activities. Champion engineering best practices and guide/mentor the team to achieve high performance. Work closely with business stakeholders, the tribe lead, the product owner, and the lead architect to successfully deliver the business outcomes. Acquire functional knowledge of the business capability being digitized/re-engineered. Demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success.
Your skills and experience Minimum 15 years of IT industry experience in full-stack development Expert in Java, Spring Boot, NodeJS, ReactJS Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, Dataproc, Dataflow, etc. Strong experience in Kubernetes and the OpenShift container platform Experience in data streaming, i.e. Kafka, Pub/Sub, etc. Experience working on public cloud: GCP preferred, AWS or Azure Knowledge of various distributed/multi-tiered architecture styles: microservices, data mesh, integration patterns, etc. Experience with modern software product delivery practices, processes, and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc. Experience leading teams and mentoring developers Key Skills: Java Spring Boot NodeJS SQL/PLSQL ReactJS Advantageous: Having prior experience in the banking/finance domain Having worked on hybrid cloud solutions, preferably using GCP Having worked on product development How we'll support you Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
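The data-streaming experience named above (Kafka, Pub/Sub) boils down to a produce/consume pattern that can be shown with an in-memory stand-in. This sketch is illustrative only: `queue.Queue` replaces the real broker, and the sentinel-based shutdown is one common convention, not a Kafka feature.

```python
# In-memory producer/consumer sketch: a producer thread publishes events
# to a queue and a consumer drains it until a sentinel arrives.

import queue
import threading

events = queue.Queue()
SENTINEL = None  # end-of-stream marker; real brokers use offsets/commits instead

def producer(n):
    for i in range(n):
        events.put({"offset": i, "payload": f"msg-{i}"})
    events.put(SENTINEL)

consumed = []

def consumer():
    while True:
        msg = events.get()
        if msg is SENTINEL:
            break
        consumed.append(msg["payload"])

t_prod = threading.Thread(target=producer, args=(3,))
t_cons = threading.Thread(target=consumer)
t_prod.start()
t_cons.start()
t_prod.join()
t_cons.join()
```

What a real broker adds over this sketch is durability, partitioning, and consumer-group rebalancing; the application-level contract (ordered delivery per partition, explicit end-of-stream handling) is the same.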

Posted 2 weeks ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Naukri logo

About us: As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful. Overview about TII: At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations. Team Overview: Every time a guest enters a Target store or browses Target.com or the app, they experience the impact of Target's investments in technology and innovation. We're the technologists behind one of the most loved retail brands, delivering joy to millions of our guests, team members, and communities. Join our global in-house technology team of more than 5,000 engineers, data scientists, architects, and product managers striving to make Target the most convenient, safe, and joyful place to shop. We use agile practices and leverage open-source software to adapt and build best-in-class technology for our team members and guests, and we do so with a focus on diversity and inclusion, experimentation, and continuous learning. Position Overview: We are building a Machine Learning Platform to enable MLOps capabilities that help data scientists and ML engineers at Target implement ML solutions at scale.
It encompasses building the feature store, model ops, experimentation, iteration, monitoring, explainability, and continuous improvement of the machine learning lifecycle. You will be part of a team building scalable applications by leveraging the latest technologies. Connect with us if you want to join us on this exciting journey. Roles and responsibilities: Build and maintain machine learning infrastructure that is scalable, reliable, and efficient. Be familiar with Google Cloud infrastructure and MLOps. Write highly scalable APIs. Deploy and maintain machine learning models, pipelines, and workflows in a production environment. Collaborate with data scientists and software engineers to design and implement machine learning workflows. Implement monitoring and logging tools to ensure that machine learning models are performing optimally. Continuously improve the performance, scalability, and reliability of machine learning systems. Work with teams to deploy and manage infrastructure for machine learning services. Create and maintain technical documentation for machine learning infrastructure and workflows. Stay up to date with the latest developments in technologies. Tech stack: GCP cloud skills, GCP Machine Learning Engineer skills, GCP Vertex AI skills, Python, microservices, API development, Cassandra, Elasticsearch, Postgres, Kafka, Docker, CI/CD, optional (Java + Spring Boot). Required Skills: Bachelor's or Master's degree in computer science, engineering, or a related field. 9+ years of experience in software development and machine learning engineering.
A Lead Machine Learning Engineer specializing in Google Cloud (GCP) needs a deep understanding of machine learning (ML) principles, cloud infrastructure, and MLOps. Hands-on experience with Vertex AI to manage an ML platform for feature engineering, ML training, and deploying models. Vertex AI skills needed are: BigQuery ML; automating ML workflows using Kubeflow Pipelines (KFP) or Cloud Composer; AI APIs; endpoints for real-time inference; model monitoring; Cloud Logging and Monitoring; Cloud Dataflow for stream processing; Cloud Dataproc (Spark and Hadoop) for distributed ML workloads. Deep experience with Python, API development, and microservices. Creating ML-powered REST APIs using FastAPI, Flask, or Cloud Functions. Java (optional, but useful for production ML systems). Expert in building high-performance APIs. Experience with DevOps practices, containerization, and tools such as Kubernetes, Docker, Jenkins, Git. Good understanding of machine learning concepts and frameworks, deep learning, LLMs, etc. Good to have: experience deploying machine learning models in a production environment. Good to have: experience with data streaming technologies such as Kafka, Dataflow, Kinesis, Pub/Sub, etc. Strong analytical and problem-solving skills. Good to have: GCP certification - Professional Machine Learning Engineer.
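The model-monitoring duty above often starts with a drift metric over score distributions. A hedged illustration follows: the population stability index (PSI) is one common drift signal (not named in the posting), and the 0.2 alert threshold is a conventional rule of thumb, not a value from the role.

```python
# Population stability index between a baseline (training-time) and a
# live score distribution, computed over pre-bucketed counts.

import math

def psi(expected_counts, actual_counts):
    """PSI across matching buckets; a small epsilon avoids log(0)."""
    eps = 1e-6
    e_total = sum(expected_counts)
    a_total = sum(actual_counts)
    total = 0.0
    for e, a in zip(expected_counts, actual_counts):
        e_pct = max(e / e_total, eps)
        a_pct = max(a / a_total, eps)
        total += (a_pct - e_pct) * math.log(a_pct / e_pct)
    return total

stable = psi([100, 200, 300], [101, 198, 301])   # near-identical mix
drifted = psi([100, 200, 300], [300, 200, 100])  # reversed mix
```

In a managed platform this computation would typically sit behind the model-monitoring service rather than be hand-rolled, but the alert logic (PSI above ~0.2 suggests significant shift) is the same.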

Posted 2 weeks ago

Apply

6.0 - 10.0 years

2 - 6 Lacs

Hyderabad

Work from Office

Naukri logo

Job Information Job Opening ID ZR_1958_JOB Date Opened 16/05/2023 Industry Technology Job Type Work Experience 6-10 years Job Title SAP BW IP City Hyderabad Province Telangana Country India Postal Code 500001 Number of Positions 5 Develop SAP BW-IP data flows in the S/4HANA system. Provide inputs on data modelling between BW 7.4 on HANA and native HANA using CompositeProviders, ADSOs, and Open ODS views. Excellent communication skills, both verbal and written in English, are required. Self-motivated, capable of managing own workload with minimum supervision. Creating complex, enterprise-transforming applications within a dynamic, progressive, technically diverse environment. Location: Pan India. I'm interested

Posted 2 weeks ago

Apply

8.0 - 12.0 years

3 - 6 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information Job Opening ID ZR_2385_JOB Date Opened 23/10/2024 Industry IT Services Job Type Work Experience 8-12 years Job Title Data modeller City Bangalore South Province Karnataka Country India Postal Code 560066 Number of Positions 1 Locations - Pune/Bangalore/Hyderabad/Indore Contract duration- 6 months Responsibilities Be responsible for the development of the conceptual, logical, and physical data models, the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms. Implement business and IT data requirements through new data strategies and designs across all data platforms (relational & dimensional - MUST and NoSQL-optional) and data tools (reporting, visualization, analytics, and machine learning). Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models. Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization. Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC. Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks. Must have Payments Background Skills Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, NoSQL data platform technologies, and ETL and data ingestion protocols). Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required. Good knowledge of metadata management, data modeling, and related tools (Erwin or ER Studio or others) required. Experience in team management, communication, and presentation. Experience with Erwin, Visio or any other relevant tool. 
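The move from a logical dimensional model to a physical one, central to the role above, can be sketched as a tiny star schema. This is illustrative only: SQLite stands in for the target RDBMS, and the payments-flavored table and column names are invented for the example, not taken from the engagement.

```python
# A minimal star schema (one fact table, one dimension) and the kind of
# aggregate query a dimensional model is designed to serve.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_merchant (
    merchant_key  INTEGER PRIMARY KEY,
    merchant_name TEXT NOT NULL
);
CREATE TABLE fact_payment (
    payment_id   INTEGER PRIMARY KEY,
    merchant_key INTEGER NOT NULL REFERENCES dim_merchant(merchant_key),
    amount       REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_merchant VALUES (1, 'Acme'), (2, 'Globex')")
conn.executemany(
    "INSERT INTO fact_payment VALUES (?, ?, ?)",
    [(10, 1, 40.0), (11, 1, 60.0), (12, 2, 25.0)],
)

# Revenue by merchant: the fact table joins to the dimension on the
# surrogate key, and the measure is aggregated per dimension attribute.
totals = dict(conn.execute("""
    SELECT d.merchant_name, SUM(f.amount)
    FROM fact_payment f JOIN dim_merchant d USING (merchant_key)
    GROUP BY d.merchant_name
""").fetchall())
```

Tools like Erwin or ER/Studio generate DDL of exactly this shape from the logical model; the modeller's standards (naming, surrogate keys, constraints) are what make the generated schema consistent across platforms.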
I'm interested

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 37 Lacs

Hyderabad, Pune

Work from Office

Naukri logo

GCP Data Engineer (BigQuery + SQL + ETL knowledge + Python, Dataflow, Pub/Sub, CI/CD). Contact: KASHIF@D2NSOLUTIONS.COM

Posted 2 weeks ago

Apply

13.0 - 18.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Skill required: Tech for Operations - Agile Project Management Designation: AI/ML Computational Science Manager Qualifications: Any Graduation Years of Experience: 13 to 18 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. Agile project management is an iterative, incremental method of managing the design and build activities of engineering, information technology, and other business areas that aims to provide new product or service development in a highly flexible and interactive manner. It requires individuals and interactions from the relevant business to respond to change, customer collaboration, and management openness to non-hierarchical forms of leadership. What are we looking for?
Results orientation Problem-solving skills Ability to perform under pressure Strong analytical skills Written and verbal communication Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area of responsibility. You would create solutions in situations where analysis requires an in-depth evaluation of variable factors. The role requires adherence to the strategic direction set by senior management when establishing near-term goals. Interaction is with senior management at a client and/or within Accenture, involving matters that may require acceptance of an alternate approach. Some latitude in decision-making is involved; you will act independently to determine methods and procedures on new assignments. Decisions made at this level have a major day-to-day impact on the area of responsibility. The person manages large to medium-sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualification: Any Graduation

Posted 3 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies