
3 Shell Scripting Jobs

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

6.0 - 8.0 years

8 - 18 Lacs

Pune

Hybrid

Job Title: Lead ETL Developer
Job Location: Pune

Job Description

Company Introduction
Join Nitor Infotech, an Ascendion company, where we harness data to drive impactful solutions. Our innovative team is dedicated to excellence in data processing and analytics, making a significant difference in the retail domain. Be part of a collaborative environment that values your expertise and contributions.

Job Overview
We are seeking an ETL Developer with expertise in Advanced SQL, Python, and Shell Scripting. This full-time position reports to the Data Engineering Manager and is available in a hybrid work model. This is a replacement position within the SRAI - EYC Implementation team.

Key Responsibilities
Design and develop ETL processes for data extraction, transformation, and loading.
Utilize Advanced SQL for data processing and analysis.
Implement data processing solutions using Python and Shell Scripting.
Collaborate with cross-functional teams to understand data requirements.
Maintain and optimize data pipelines for performance and reliability.
Provide insights and analysis to support business decisions.
Ensure data quality and integrity throughout the ETL process.
Stay updated on industry trends and best practices in data engineering.

Must-Have Skills and Qualifications
7-8 years of experience as an ETL Developer.
Expertise in Advanced SQL for data manipulation and analysis.
Proficient in Python and Shell Scripting.
Foundational understanding of Databricks and Power BI.
Strong logical problem-solving skills.
Experience in data processing and transformation.
Understanding of the retail domain is a plus.

Good-to-Have Skills and Qualifications
Familiarity with cloud data platforms (AWS, Azure).
Knowledge of data warehousing concepts.
Experience with data visualization tools.
Understanding of Agile methodologies.

What We Offer
Competitive salary and comprehensive benefits package.
Opportunities for professional growth and advancement.
Collaborative and innovative work environment.
Flexible work arrangements.
Impactful work that drives industry change.

DEI Statement
At Nitor Infotech, we embrace diversity and inclusion. We actively foster an environment where all voices are heard and valued.

ISMS Statement
Nitor Infotech maintains ISO 27001 certification. All employees must adhere to our information security policies.
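The listing above centres on building ETL flows with Advanced SQL, Python, and shell scripting. Purely as a rough, non-authoritative illustration of that kind of work (not part of the posting), the sketch below extracts rows with plain SQL, aggregates them in Python, and loads the result into a reporting table; the SQLite database, table names, and columns are hypothetical placeholders.

    # Hypothetical minimal ETL sketch: extract with SQL, transform in Python, load.
    # The SQLite file, the raw_sales table, and all column names are illustrative
    # placeholders, not details from the job posting.
    import sqlite3

    def run_etl(db_path="retail.db"):
        conn = sqlite3.connect(db_path)
        try:
            # Extract: pull raw daily sales rows with plain SQL (raw_sales is assumed to exist).
            rows = conn.execute(
                "SELECT store_id, sale_date, amount FROM raw_sales WHERE amount IS NOT NULL"
            ).fetchall()

            # Transform: aggregate amounts per store and date.
            totals = {}
            for store_id, sale_date, amount in rows:
                totals[(store_id, sale_date)] = totals.get((store_id, sale_date), 0.0) + amount

            # Load: write the aggregates into a reporting table.
            conn.execute(
                "CREATE TABLE IF NOT EXISTS daily_sales (store_id TEXT, sale_date TEXT, total REAL)"
            )
            conn.executemany(
                "INSERT INTO daily_sales (store_id, sale_date, total) VALUES (?, ?, ?)",
                [(store, day, total) for (store, day), total in totals.items()],
            )
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        run_etl()

In practice a step like this would typically be wrapped in a shell script or scheduler job so it can be chained with the other stages of the pipeline.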

Posted 5 days ago

Apply

2.0 - 7.0 years

4 - 7 Lacs

Hyderabad, Bengaluru

Work from Office

Installation, configuration, and maintenance of Linux servers (RedHat, CentOS, Ubuntu)
Monitor system performance, troubleshoot issues, and perform regular system tuning
Manage user accounts, permissions, and security policies
Automate routine tasks

Required Candidate Profile
Manage patching, updates, and software installations
Work closely with DevOps and development teams to support deployment pipelines
Ensure system security and firewall configurations
Document processes

Perks and Benefits
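The posting above emphasises automating routine tasks on Linux servers. As an illustration only (not taken from the posting), the following Python sketch shows the sort of routine check an admin might script: it reports disk usage for a few mount points and flags any that cross a threshold. The mount points and the 85% limit are assumptions.

    # Hypothetical routine-check sketch for a Linux server: warn when a filesystem
    # crosses a usage threshold. Mount points and the threshold are illustrative
    # assumptions, not values from the job posting.
    import shutil

    MOUNT_POINTS = ["/", "/var", "/home"]   # assumed paths
    THRESHOLD = 0.85                        # assumed 85% usage limit

    def check_disk_usage():
        for mount in MOUNT_POINTS:
            try:
                usage = shutil.disk_usage(mount)
            except FileNotFoundError:
                print(f"{mount}: not mounted, skipping")
                continue
            used_ratio = usage.used / usage.total
            status = "WARN" if used_ratio >= THRESHOLD else "ok"
            print(f"{mount}: {used_ratio:.0%} used [{status}]")

    if __name__ == "__main__":
        check_disk_usage()

A script like this would usually be run from cron or a systemd timer so the check happens unattended.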

Posted 2 months ago

Apply

8.0 - 12.0 years

8 - 16 Lacs

Hyderabad, Bangalore Rural, Bengaluru

Hybrid

We are hiring for a Databricks Admin role within our organisation.

Total Experience: 8+ yrs
Relevant Experience: 5+ yrs
Mandatory Skills: Databricks Admin, Python, Shell Scripting
Location: Hyderabad (preferred), Bangalore

Role & Responsibilities
Workspace Management: Create and manage Databricks workspaces, ensuring proper configuration and access control.
User & Identity Management: Administer user roles, permissions, and authentication mechanisms.
Cluster Administration: Configure, monitor, and optimize Databricks clusters for efficient resource utilization.
Security & Compliance: Implement security best practices, including data encryption, access policies, and compliance adherence.
Performance Optimization: Troubleshoot and resolve performance issues related to Databricks workloads.
Integration & Automation: Work with cloud platforms (AWS, Azure, GCP) to integrate Databricks with other services.
Monitoring & Logging: Set up monitoring tools and analyze logs to ensure system health.
Data Governance: Manage Unity Catalog and other governance tools for structured data access.
Collaboration: Work closely with data engineers, analysts, and scientists to support their workflows.

Qualifications
Proficiency in Python or Scala for scripting and automation.
Knowledge of cloud platforms (AWS).
Familiarity with Databricks Delta Lake and MLflow.
Understanding of ETL processes and data warehousing concepts.
Strong problem-solving and analytical skills.
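The Databricks Admin responsibilities above include cluster administration and monitoring. As a rough, non-authoritative sketch of what such automation can look like, the Python snippet below calls the Databricks Clusters REST API (GET /api/2.0/clusters/list) to print each cluster's name and state. The DATABRICKS_HOST and DATABRICKS_TOKEN environment variables and the use of the requests library are assumptions, not requirements from the posting.

    # Hypothetical Databricks cluster-inventory sketch using the Clusters REST API.
    # DATABRICKS_HOST / DATABRICKS_TOKEN environment variables and the `requests`
    # dependency are assumptions; this is illustrative, not part of the posting.
    import os
    import requests

    def list_clusters():
        host = os.environ["DATABRICKS_HOST"]     # e.g. the workspace URL
        token = os.environ["DATABRICKS_TOKEN"]   # personal access token
        resp = requests.get(
            f"{host}/api/2.0/clusters/list",
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        resp.raise_for_status()
        for cluster in resp.json().get("clusters", []):
            # Each entry carries a name, current state, and cluster id.
            print(f"{cluster.get('cluster_name')}: {cluster.get('state')} ({cluster.get('cluster_id')})")

    if __name__ == "__main__":
        list_clusters()

The same pattern extends to other administrative endpoints (jobs, permissions, workspace objects), with the resulting inventory feeding monitoring or cost reports.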

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies