Software Engineer - IT Reporting & Analytics

Experience: 3 years

Posted: 18 hours ago | Platform: SimplyHired

Work Mode: On-site

Job Type: Full Time

Job Description

We are seeking a highly skilled and motivated Lead Data & Analytics Engineer to join our dynamic team. This role is crucial in transforming raw data into actionable insights, enabling data-driven decision-making across the organization. The ideal candidate will possess a strong blend of business intelligence development expertise (Power BI or Qlik Sense), robust data engineering capabilities with a focus on Google Cloud Platform (GCP), and a solid understanding of DevOps practices for continuous delivery and operational excellence. You will be instrumental in designing, building, and maintaining scalable data solutions, from data ingestion and transformation to visualization and deployment.


Must-Have Qualifications:

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related quantitative field.
  • 3-5+ years of hands-on experience in Business Intelligence development using either Power BI or Qlik Sense.
      * Proficiency in data modeling, dashboarding, reporting, and advanced features (e.g., DAX for Power BI, scripting/set analysis for Qlik Sense).
  • 3-5+ years of strong experience in Data Engineering, with significant hands-on experience on Google Cloud Platform (GCP) or any leading cloud platform.
      * Expertise with core GCP data services (e.g., BigQuery, Cloud Dataflow, Cloud Storage, Cloud SQL, Cloud Pub/Sub) or equivalent services in any leading cloud platform.
      * Strong proficiency in SQL and at least one programming language such as Python, Java, or Scala for data manipulation and pipeline development.
      * Extensive experience designing and implementing ETL/ELT processes and managing data warehousing solutions.
  • Solid understanding of and practical experience with DevOps principles and practices.
      * Experience with CI/CD tools (e.g., Jenkins, GitLab CI, Cloud Build) and version control systems (e.g., Git).
      * Familiarity with Infrastructure as Code (IaC) tools (e.g., Terraform, Ansible).
      * Experience with containerization (Docker) and orchestration (e.g., Kubernetes, Astronomer).
  • Strong analytical, problem-solving, and critical thinking skills with meticulous attention to detail.
  • Excellent communication and interpersonal skills, with the ability to articulate complex technical concepts to both technical and non-technical stakeholders.
  • Experience working in an Agile development environment.
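To give a concrete flavour of the SQL-plus-Python data manipulation skills listed above, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for a cloud warehouse such as BigQuery (the table and data are invented for illustration):

```python
import sqlite3

# In-memory database stands in for a cloud warehouse such as BigQuery.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("APAC", 120.0), ("EMEA", 80.0), ("APAC", 30.0)],
)

# Aggregate revenue per region -- the kind of query a BI dashboard sits on.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 150.0), ('EMEA', 80.0)]
conn.close()
```

In a real engagement the same query would run against warehouse tables fed by the ETL/ELT pipelines described below, with Python handling orchestration around it.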

Nice-to-Have Qualifications:

  • GCP Data Engineer or DevOps certification.
  • Experience with other cloud platforms (AWS, Azure).
  • Familiarity with big data technologies like Apache Spark or Hadoop ecosystems.
  • Experience with data governance, security, and compliance principles (e.g., GDPR, HIPAA).
  • Knowledge of streaming data processing (e.g., Apache Kafka, GCP Pub/Sub).
  • Experience with other BI tools (e.g., Tableau, Looker).
  • Prior experience mentoring junior team members.

Business Intelligence Development (Power BI / Qlik Sense):

* Design, develop, and maintain interactive and insightful dashboards, reports, and data visualizations using Power BI or Qlik Sense.

* Collaborate with business stakeholders to gather, understand, and translate complex business requirements into technical specifications for reporting and analytics solutions.

* Develop custom DAX calculations (for Power BI) or Qlik Sense scripts, including set analysis and QVD generation, to support advanced analytics and complex business scenarios.

* Optimize Power BI or Qlik Sense applications for performance and usability, ensuring data accuracy and consistency across all BI solutions.

* Implement data security measures, including row-level security (RLS) and data encryption, within BI reports and dashboards.
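Row-level security (RLS), mentioned above, restricts each viewer to the rows they are entitled to see. Power BI and Qlik Sense implement this declaratively (DAX roles, section access); the underlying idea can be sketched in plain Python, with a made-up entitlement table:

```python
# Hypothetical entitlement table: which regions each user may see.
USER_REGIONS = {"alice": {"EMEA"}, "bob": {"APAC", "EMEA"}}

ROWS = [
    {"region": "APAC", "revenue": 150.0},
    {"region": "EMEA", "revenue": 80.0},
]

def rows_for(user):
    """Return only the rows the given user is entitled to see."""
    allowed = USER_REGIONS.get(user, set())
    return [r for r in ROWS if r["region"] in allowed]

print(rows_for("alice"))  # only the EMEA row
```

An unknown user sees nothing, which mirrors the deny-by-default behaviour RLS roles are normally configured with.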

Data Engineering:

* Design, build, and maintain scalable, efficient, and reliable data pipelines and architectures on Google Cloud Platform (GCP).

* Utilize GCP services such as BigQuery, Cloud Dataflow, Cloud Storage, Cloud Pub/Sub, Dataform, Cloud Run, and Artifact Registry for data ingestion, transformation, and storage.

* Develop and optimize Extract, Transform, Load (ETL/ELT) processes to integrate data from various internal and external sources, both structured and unstructured.

* Implement and manage data warehousing concepts, data modeling (relational and dimensional), and database design.

* Ensure data quality, integrity, and governance throughout the data lifecycle.

* Monitor, troubleshoot, and resolve issues in data systems and pipelines to ensure continuous operation and performance.

DevOps & Operations:

* Implement and manage CI/CD pipelines for automated testing, deployment, and release of data engineering and BI solutions on GCP.

* Utilize Infrastructure as Code (IaC) tools such as Terraform or Ansible to provision and manage GCP resources.

* Employ containerization technologies like Docker and orchestration tools like Kubernetes for deploying data applications and services.

* Automate repetitive tasks and operational processes using scripting languages (e.g., Python, Bash).

* Monitor system performance, resource utilization, and implement logging and alerting mechanisms for data infrastructure.

* Collaborate with development and operations teams to streamline workflows and ensure high availability, reliability, and scalability of cloud-based systems.
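The extract-transform-load pattern running through the responsibilities above can be sketched in a few lines of pure Python (a production pipeline would use managed services such as Cloud Dataflow or Dataform; the CSV source and cleaning rule here are invented for illustration):

```python
import csv
import io

# Extract: parse raw CSV as it might arrive in a Cloud Storage landing bucket.
RAW = "order_id,amount\n1,100\n2,\n3,250\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

# Transform: drop rows with missing amounts and cast strings to proper types.
def transform(records):
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in records
        if r["amount"]
    ]

# Load: here just an in-memory list; in practice a BigQuery load job or insert.
warehouse = transform(extract(RAW))
print(warehouse)
```

Each stage is a separate function so it can be tested, monitored, and retried independently, which is the same decomposition the managed pipeline tools impose.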
