Job Title: Data Analyst

Position Summary: We are seeking a detail-oriented and technically proficient Data Analyst to work with diverse and complex datasets to uncover insights, support data-driven decision-making, and contribute to enterprise data initiatives. The ideal candidate will have hands-on experience with SQL, Snowflake, APIs, and ETL pipelines, as well as a strong understanding of data quality, governance, and integration patterns.
Required Qualifications:
- 7+ years of professional experience in a data analyst or data engineering support role.
- Strong expertise in SQL for data extraction, transformation, and validation.
- Experience working with Snowflake and familiarity with data warehousing concepts.
- Solid understanding of ETL processes and working knowledge of data pipeline concepts.
- Familiarity with REST API integration and JSON/XML data structures.
- Experience with data profiling, data quality checks, and normalization techniques.
- Proficiency in Python and Java for custom data solutions and automation.
- Experience working with Qlik Replicate or similar real-time replication tools.
- Hands-on exposure to YugabyteDB or other distributed SQL databases.
Nice to Have / Preferred Skills:
- Understanding of AI/ML concepts and how they apply to data preparation and modeling.
- Experience with CI/CD pipelines for automated testing and deployment of analytics solutions.
- Familiarity with Rocket ETL or similar ETL automation tools.
- Exposure to data visualization tools such as Sigma, Tableau, or Power BI.
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
Key Responsibilities:
- Analyze and interpret complex structured and semi-structured datasets from multiple internal and external data sources.
- Write advanced SQL queries to extract, manipulate, validate, and summarize large volumes of data efficiently.
- Perform data profiling, cleansing, and normalization to improve data integrity and consistency.
- Use Snowflake for querying, transforming, and analyzing data in a cloud-based data warehouse.
- Integrate data from RESTful APIs and third-party data providers.
- Collaborate with data engineering teams on ETL pipeline validation and enhancements.
- Ensure compliance with data governance standards, data security, and privacy policies.
- Use Qlik Replicate to support real-time and near-real-time data ingestion workflows.
- Access and query data in distributed SQL databases like YugabyteDB.
- Develop scripts in Python and Java for data processing, automation, and validation tasks.