
Senior Process Manager

7 - 12 years

9 - 14 Lacs

Posted: 1 day ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description


  
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert providing advice in that area, and focus on continuous improvement for maximum efficiency. It is vital to maintain a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal and external customers by identifying and fulfilling their needs. The candidate should be able to break complex problems down into logical, manageable parts, generate and compare multiple options, and set priorities to resolve them.

The ideal candidate is proactive and goes beyond expectations to achieve results and create new opportunities. They must positively influence the team: motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key, both to explain organizational objectives, assignments, and the big picture to the team, and to articulate a clear team vision and objectives.
 Senior Process Manager Roles and responsibilities: 
We are seeking a talented and motivated Data Engineer to join our dynamic team. The ideal candidate will have a deep understanding of data integration processes and experience in developing and managing data pipelines using Python, SQL, and PySpark within Databricks. You will be responsible for designing robust backend solutions, implementing CI/CD processes, and ensuring data quality and consistency.
  •  Data Pipeline Development: 
  • Use Databricks features to explore raw datasets and understand their structure.
  • Create and optimize Spark-based workflows.
  • Build end-to-end data processing pipelines: ingest raw data, transform it, and run analyses on the processed data.
  • Create and maintain data pipelines using Python and SQL.
  •  Solution Design and Architecture: 
  • Design and architect backend solutions for data integration, ensuring they are robust, scalable, and aligned with business requirements.
  • Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks.
  •  Automation and Scheduling: 
  • Automate data integration processes and schedule jobs on servers to ensure seamless data flow.
  •  Data Quality and Monitoring: 
  • Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency.
  •  CI/CD Implementation: 
  • Use Jenkins and Bitbucket to create and maintain metadata and job files.
  • Implement continuous integration and continuous deployment (CI/CD) processes in both development and production environments to deploy data pipelines efficiently.
  •  Collaboration and Documentation: 
  • Work effectively with cross-functional teams, including software engineers, data scientists, and DevOps, to ensure successful project delivery.
  • Document data pipelines and architecture to ensure knowledge transfer and maintainability.
  • Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows.
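The ingest-transform-analyse flow described above can be sketched in a few lines of plain Python. This is a minimal illustration only, using SQLite in place of a warehouse; the table, column names, and sample data are assumptions for the sketch, not details from this role:

```python
import sqlite3

# Sample raw records: (order_date, customer, amount).
# The negative-amount row stands in for a bad record caught by a quality rule.
raw_rows = [
    ("2024-01-01", "A", 120.0),
    ("2024-01-01", "B", -5.0),   # invalid, filtered out below
    ("2024-01-02", "A", 80.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_date TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_rows)

# Transform: keep only rows passing a simple data quality check
conn.execute("""
    CREATE TABLE orders AS
    SELECT order_date, customer, amount
    FROM raw_orders
    WHERE amount > 0
""")

# Analyse: daily revenue over the processed data
daily = conn.execute("""
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
""").fetchall()

print(daily)  # → [('2024-01-01', 120.0), ('2024-01-02', 80.0)]
```

In a Databricks setting the same shape would typically be expressed with PySpark DataFrames (`spark.read`, `filter`, `groupBy`, `write`) rather than SQLite, but the ingest/transform/analyse stages are the same.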

  •  Technical and Functional Skills: 
  •  Education and Experience: 
  • Bachelor's degree with 7+ years of experience, including at least 3+ years of hands-on experience in SQL and Python.
  •  Technical Proficiency: 
  • Proficiency in writing and optimizing SQL queries in MySQL and SQL Server.
  • Expertise in Python for writing reusable components and enhancing existing ETL scripts.
  • Solid understanding of ETL concepts and data pipeline architecture, including CDC, incremental loads, and slowly changing dimensions (SCDs).
  • Hands-on experience with PySpark.
  • Knowledge of and experience with Databricks is a bonus.
  • Familiarity with data warehousing solutions and ETL processes.
  • Understanding of data architecture and backend solution design.
  •  Cloud and CI/CD Experience: 
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Familiarity with Jenkins and Bitbucket for CI/CD processes.
  •  Additional Skills: 
  • Ability to work independently and manage multiple projects simultaneously.
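One of the ETL concepts the role calls out, slowly changing dimensions (SCD Type 2), can be sketched in plain Python. The field names, the single tracked attribute (`city`), and the data are illustrative assumptions, not part of the posting; a real implementation on Databricks would usually use a Delta Lake `MERGE` instead:

```python
from datetime import date

# SCD Type 2: each dimension row carries valid_from/valid_to and a current flag,
# so history is preserved when an attribute changes.
dim = [
    {"key": "C1", "city": "Pune", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, updates, load_date):
    """Expire changed current rows and append new versions (Type 2 semantics)."""
    current_by_key = {r["key"]: r for r in dim if r["is_current"]}
    for upd in updates:
        cur = current_by_key.get(upd["key"])
        if cur is None:
            # Brand-new key: insert as the first (current) version
            dim.append({**upd, "valid_from": load_date,
                        "valid_to": None, "is_current": True})
        elif cur["city"] != upd["city"]:
            # Attribute changed: close out the old version, add a new one
            cur["valid_to"] = load_date
            cur["is_current"] = False
            dim.append({**upd, "valid_from": load_date,
                        "valid_to": None, "is_current": True})
    return dim

dim = apply_scd2(dim, [{"key": "C1", "city": "Mumbai"}], date(2024, 6, 1))
current = [r for r in dim if r["is_current"]]
# dim now holds two rows for C1: the expired Pune version and the current Mumbai one.
```

An incremental load follows the same spirit: compare incoming records against the current state (or a high-watermark timestamp) and apply only the deltas, rather than reloading the full table.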
    eClerx

    IT Services and IT Consulting

    Mumbai, Maharashtra

    10001 Employees


      Key People

    • Mukesh Gupta

      Chief Executive Officer
    • Ankush Tiwari

      Chief Financial Officer
