
Data Analyst - SAS & PySpark (Bangalore)

4 - 7 years

8 - 15 Lacs

Posted: 5 hours ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

Job Title: Data Analyst - SAS & PySpark

  • Location:

     Bangalore
  • Job Type:

     Full-Time
  • Responsibilities:

Design and Develop Data Pipelines:

  • Create and maintain scalable data pipelines to process large volumes of data efficiently.
  • Implement ETL (Extract, Transform, Load) processes to integrate data from various sources into a unified data platform.
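The ETL responsibility above can be sketched in miniature. This is a minimal stdlib-only Python illustration, not the team's actual pipeline: in practice this role would use PySpark against a real warehouse, and the source records, table name, and column names here are all hypothetical (sqlite3 stands in for the target platform).

```python
import sqlite3

# Hypothetical raw records, standing in for an external source system.
raw_orders = [
    {"order_id": 1, "amount": "120.50", "region": "south"},
    {"order_id": 2, "amount": "80.00", "region": "NORTH"},
    {"order_id": 3, "amount": "bad", "region": "south"},  # malformed row
]

def extract():
    """Extract: pull raw rows from the (stand-in) source."""
    return raw_orders

def transform(rows):
    """Transform: coerce types, normalize values, drop rows that fail validation."""
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # drop unparseable amounts instead of loading bad data
        clean.append((row["order_id"], amount, row["region"].lower()))
    return clean

def load(rows, conn):
    """Load: write the cleaned rows into the unified data platform (sqlite3 here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
# → (2, 200.5): the malformed row was dropped during transform
```

The same extract/transform/load split scales up directly: in PySpark, `transform` becomes DataFrame operations and `load` a `DataFrameWriter` call, but the shape of the pipeline is the same.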

Data Integration:

  • Work with technologies such as Apache Hadoop, Spark, Kafka, and other big data tools to integrate and process data.
  • Ensure seamless data flow between different systems and platforms.

Data Storage Management:

  • Optimize and manage data storage solutions, including data lakes and data warehouses.
  • Ensure data is stored in a manner that is both cost-effective and performant.

Data Quality and Integrity:

  • Implement data validation and cleansing processes to ensure high data quality.
  • Monitor data pipelines to detect and resolve data quality issues promptly.
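A data-quality check of the kind described above might look like the following rule-based sketch. The rules (required fields, duplicate IDs) and the sample records are illustrative assumptions, not a prescribed framework; real pipelines would typically attach such checks to each pipeline stage and feed the issue list into monitoring.

```python
def validate(records, required=("id", "email")):
    """Return a list of (row_index, issue) pairs for records that break the rules."""
    issues = []
    seen_ids = set()
    for i, rec in enumerate(records):
        # Rule 1: required fields must be present and non-empty.
        for field in required:
            if rec.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        # Rule 2: the id column must be unique.
        if rec.get("id") in seen_ids:
            issues.append((i, "duplicate id"))
        seen_ids.add(rec.get("id"))
    return issues

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 1, "email": ""},          # duplicate id and missing email
    {"id": 2, "email": "b@example.com"},
]
print(validate(records))
# → [(1, 'missing email'), (1, 'duplicate id')]
```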

Collaboration with Stakeholders:

  • Work closely with data scientists, analysts, and other stakeholders to understand their data needs.
  • Translate business requirements into technical specifications and data solutions.

Performance Monitoring and Troubleshooting:

  • Monitor the performance of data pipelines and systems to ensure reliability and efficiency.
  • Troubleshoot and resolve any issues that arise in the data processing workflow.

Documentation and Best Practices:

  • Develop and maintain comprehensive documentation for data engineering processes and systems.
  • Establish and promote best practices for data engineering within the team.

Stay Current with Industry Trends:

  • Keep up to date with the latest trends and advancements in big data technologies and analytics.
  • Evaluate and recommend new tools and technologies to improve data processing and analysis capabilities.

Data Security and Compliance:

  • Implement and enforce data security measures to protect sensitive information.
  • Ensure compliance with relevant data protection regulations and standards.

Support Real-Time Data Processing:

  • Develop solutions for real-time data processing and stream analytics to support immediate data insights.
  • Utilize tools and frameworks designed for real-time data handling.
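The core idea behind the stream-analytics responsibility above is windowed aggregation over an event stream. The sketch below shows a tumbling-window count in plain Python with made-up events; it buffers everything for simplicity, whereas a production system would use an actual streaming engine such as Spark Structured Streaming or Kafka Streams for incremental, unbounded processing.

```python
from collections import defaultdict

def windowed_counts(events, window_size=10):
    """Group (timestamp, key) events into tumbling windows of `window_size` units.

    Yields (window_start, {key: count}) in window order. Simplified: buffers
    all events in memory rather than processing them incrementally.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - ts % window_size  # e.g. ts=12 falls in window 10
        windows[window_start][key] += 1
    for start in sorted(windows):
        yield start, dict(windows[start])

events = [(1, "click"), (3, "view"), (4, "click"), (12, "click")]
for start, counts in windowed_counts(events):
    print(start, counts)
# → 0 {'click': 2, 'view': 1}
# → 10 {'click': 1}
```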

Continuous Improvement:

  • Continuously seek opportunities to improve data engineering processes and systems.
  • Participate in code reviews and provide constructive feedback to peers.

  • Qualifications:

  • Bachelor's or master's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Data Analyst/Engineer or in a similar role, with a focus on big data technologies.
  • Strong proficiency in programming languages such as Python (with Pandas, NumPy), PySpark, SAS, Java, or Scala.
  • Experience with big data frameworks and tools (e.g., Hadoop, Spark, Kafka, Hive, PySpark).
  • Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud) and their big data services.
  • Solid understanding of data modeling, ETL processes, and data warehousing concepts.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.

  • Technical Skills:

  • Proficiency in SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB, Cassandra).
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Knowledge of containerization and orchestration tools (e.g., Docker, Kubernetes).
  • Familiarity with version control systems (e.g., Git).
  • Experience with workflow orchestration tools (e.g., Apache Airflow, Luigi).
  • Understanding of data governance and compliance standards.
  • Experience with real-time data processing and stream analytics.
  • Knowledge of machine learning and data science concepts.
  • Certification in big data technologies or cloud platforms.
  • Orchestration Skills:

     Proficiency in using orchestration tools like Apache Airflow, Luigi, and Kubernetes to automate and manage complex data workflows.
  • Cloud Exposure:

     Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud, including their big data and analytics services (e.g., AWS Redshift, Azure Synapse, Google BigQuery).
  • Soft Skills:

  • Analytical Thinking:

     Ability to analyze complex data and derive actionable insights.
  • Problem-Solving:

     Strong aptitude for identifying issues and developing effective solutions.
  • Communication:

     Excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders.
  • Teamwork:

     Ability to work collaboratively in a team environment and build positive relationships with colleagues.
  • Adaptability:

     Flexibility to adapt to changing priorities and technologies in a fast-paced environment.
  • Attention to Detail:

     Meticulous attention to detail to ensure data accuracy and quality.
  • Time Management:

     Strong organizational skills to manage multiple tasks and meet deadlines.
  • Continuous Learning:

     Eagerness to stay updated with the latest industry trends and technologies.
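The workflow-orchestration skill listed above boils down to declaring tasks and dependencies as a DAG and running them in order. The sketch below uses only Python's stdlib `graphlib` to show that idea with hypothetical task names; Airflow and Luigi provide the same dependency-ordered execution plus scheduling, retries, and monitoring on top.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks and their upstream dependencies,
# mirroring how an Airflow DAG declares task ordering
# (extract >> transform >> validate >> load).
deps = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

# static_order() yields tasks so every task runs after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
# → ['extract', 'transform', 'validate', 'load']
```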

