Work from Office
Full Time
About the Role
We are seeking a talented and detail-oriented Data Developer to design, develop, and maintain data solutions that support analytics, reporting, and business decision-making. You will play a key role in building and optimizing data pipelines, integrating disparate data sources, and ensuring high-quality data delivery across systems. The ideal candidate has strong technical skills, a passion for data, and experience working in modern data environments.

Key Responsibilities
- Develop, optimize, and maintain data pipelines and ETL/ELT processes to ingest, transform, and load data from various sources.
- Build and maintain data models, data warehouses, and data marts to support business intelligence and analytics needs.
- Collaborate with data analysts, engineers, and business users to understand data requirements and deliver fit-for-purpose solutions.
- Write efficient SQL queries and scripts for data manipulation, cleansing, and analysis.
- Ensure data quality, consistency, and integrity across platforms and applications.
- Support data integration between internal systems and third-party platforms via APIs or file transfers.
- Participate in performance tuning and troubleshooting of data workflows and queries.
- Document data solutions, processes, and data definitions to support data governance and maintain transparency.
- Stay current with industry trends and recommend improvements to the existing data architecture and practices.

Requirements
- Experience: 3+ years in ETL development, data engineering, or a similar role.
- Hands-on experience with ETL tools and frameworks (e.g., Apache NiFi, Snowflake).
- Strong SQL skills and familiarity with scripting languages such as Python or Shell.
- Experience with relational databases (e.g., SQL Server, PostgreSQL, MySQL) and data warehouses.
- Familiarity with version control systems (e.g., Git) and CI/CD pipelines.

Skills
- Solid understanding of data modelling concepts (e.g., star/snowflake schema, normalization).
- Good knowledge of cloud platforms and their data services.
- Strong analytical thinking and problem-solving skills.
- Effective communication skills and the ability to work collaboratively in a team environment.

Preferred Qualifications
- Experience with big data tools and platforms (e.g., Spark, Hadoop, Databricks).
- Familiarity with data orchestration tools such as Apache Airflow.
- Understanding of data governance, security, and compliance principles.
- Background in agile software development and DevOps practices.
Facctum
Bengaluru
4.0 - 7.0 Lacs P.A.