5.0 - 8.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Seeking Big Data Engineer with 4+ yrs in PySpark, Azure (Data Lake, Synapse, ADF, Databricks), and Medallion Architecture. Build scalable pipelines, optimize ETL, ensure data security, and collaborate on analytics solutions. Azure certs preferred.
Posted 8 hours ago
3.0 - 5.0 years
15 - 30 Lacs
Bengaluru
Work from Office
About the Role
We are looking for a skilled Data Engineer with 3–5 years of experience to design, implement, and maintain a Medallion Architecture on the Azure Data Platform. The ideal candidate should have strong hands-on experience with SQL Server, PySpark, and Power BI, and a solid understanding of data modelling and data pipelines.

Key Responsibilities
• Design, build, and optimize scalable data pipelines and dataflows in Azure Data Factory / Azure Synapse following the Medallion Architecture (Bronze, Silver, Gold layers).
• Develop and manage data ingestion processes from diverse sources into Azure Data Lake.
• Implement and optimize data transformations using PySpark / Databricks.
• Work with SQL Server to build and maintain structured data storage, queries, and stored procedures.
• Develop and support Power BI dashboards to deliver business insights.
• Ensure best practices in data governance, data quality, and security.
• Collaborate with business stakeholders, analysts, and other engineers to understand data needs and deliver solutions.
• Monitor and troubleshoot data pipelines, ensuring high availability and performance.

Required Skills & Qualifications
• 3–5 years of experience as a Data Engineer or in a related role.
• Strong knowledge of Azure Data Services (Microsoft Fabric, Azure Data Factory, Azure Data Lake, Synapse, Databricks).
• Hands-on experience with PySpark for ETL and data transformation.
• Proficiency in SQL Server (queries, stored procedures, optimization).
• Experience building and maintaining Power BI dashboards and data models.
• Solid understanding of Medallion Architecture (Bronze, Silver, Gold) for data engineering.
• Good knowledge of data modelling, data warehousing concepts, and performance tuning.
• Experience with Git / DevOps CI/CD for data projects is a plus.

Good to Have
• Knowledge of Delta Lake and Parquet formats.
• Familiarity with Azure Event Hub / Kafka for real-time streaming.
• Experience with Python and REST APIs for data integration.
• Exposure to cloud cost optimization and monitoring tools.

Education
• Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
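The Bronze/Silver/Gold layering this role centers on can be sketched as follows. The posting's stack is PySpark/Databricks; plain Python dicts are used here only so the example stays self-contained, and all record fields and function names (`to_silver`, `to_gold`) are hypothetical.

```python
# Medallion-style layering: Bronze (raw) -> Silver (cleaned) -> Gold (aggregated).

bronze = [  # raw, as-ingested records: may contain duplicates and invalid rows
    {"order_id": 1, "amount": "100.0", "region": "south"},
    {"order_id": 1, "amount": "100.0", "region": "south"},        # duplicate key
    {"order_id": 2, "amount": "not-a-number", "region": "west"},  # fails validation
    {"order_id": 3, "amount": "250.5", "region": "south"},
]

def to_silver(rows):
    """Bronze -> Silver: deduplicate by key, cast and validate fields."""
    seen, silver = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # keep first occurrence only
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop rows that fail type validation
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})
    return silver

def to_gold(rows):
    """Silver -> Gold: aggregate into a business-level view (revenue per region)."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'south': 350.5}
```

In a real Databricks pipeline each layer would typically be a Delta table, with the same dedup/cast logic expressed as PySpark DataFrame transformations.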
Posted 6 days ago
8.0 - 11.0 years
0 - 26 Lacs
Bengaluru
Work from Office
Roles and Responsibilities:
• Design and develop data models for large-scale datasets using Snowflake, ensuring scalability, performance, and maintainability.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications for data modeling.
• Develop complex SQL queries to extract insights from the designed databases and provide recommendations for improvement.
• Participate in code reviews to ensure adherence to coding standards and best practices.

Job Requirements:
• 8–11 years of experience in Data Modeling with expertise in Snowflake.
• Strong understanding of database design principles, including normalized/denormalized star and snowflake schema designs.
• Proficiency in writing complex SQL queries for querying large datasets.
• Experience working on big data projects involving high-volume transaction processing.
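The star-schema design this posting asks for can be sketched minimally: one fact table keyed to dimension tables, queried by joining facts to dimensions and aggregating. The posting targets Snowflake; `sqlite3` is used here only so the example runs anywhere, and all table and column names are illustrative.

```python
# Minimal star schema: fact_sales references dim_customer and dim_date.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (
    sale_id     INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    date_id     INTEGER REFERENCES dim_date(date_id),
    amount      REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO dim_date VALUES (10, '2024-01'), (11, '2024-02');
INSERT INTO fact_sales VALUES
    (100, 1, 10, 500.0),
    (101, 1, 11, 300.0),
    (102, 2, 10, 200.0);
""")

# Typical star-schema query: join the fact table to a dimension and aggregate.
cur.execute("""
SELECT c.name, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer c ON c.customer_id = f.customer_id
GROUP BY c.name
ORDER BY c.name
""")
print(cur.fetchall())  # [('Acme', 800.0), ('Globex', 200.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.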
Posted 1 month ago
6.0 - 10.0 years
18 - 30 Lacs
Noida
Remote
Alcor Solutions is a leading digital transformation and cloud consulting firm, enabling enterprises to leverage modern data platforms, cloud-native solutions, and emerging technologies to drive innovation and business outcomes. We are seeking an Azure Databricks Lead Engineer to architect, administer, and scale our enterprise Databricks environment while establishing best practices and governance across the organization.

Role Overview
The Azure Databricks Lead Engineer will serve as the subject matter expert (SME) for our Databricks platform, overseeing its architecture, administration, and optimization across the enterprise. This individual will be responsible for designing the Databricks Medallion architecture, implementing Unity Catalog governance, managing cluster policies, and driving the adoption of Databricks best practices. The role requires a combination of hands-on administration and strategic platform design, ensuring Databricks is secure, scalable, and fully aligned with enterprise data initiatives.

Responsibilities
• Design and implement Databricks Medallion architecture for enterprise-wide data usage.
• Define workspace strategy, including design, configuration, and organization for multi-team and multi-project usage.
• Establish best practices for data ingestion, transformation, and consumption using Databricks.
• Administer and optimize the Azure Databricks platform (clusters, pools, compute policies).
• Set up and manage Unity Catalog, including access control, catalog/schema security, and governance for PII and PHI data.
• Create and enforce cluster management policies, workspace permissions, and environment segregation (dev/test/prod).
• Monitor platform health, usage, and costs; implement scaling and tuning strategies.
• Collaborate with cloud infrastructure, networking, and security teams to ensure Databricks integrates with enterprise systems and policies.
• Support data engineers, data scientists, and analytics teams by enabling seamless deployment of applications and pipelines on Databricks.
• Act as the primary technical advisor and escalation point for Databricks-related issues.
• Define processes for sizing and provisioning Databricks clusters based on project requirements.
• Establish governance for non-production and production environments to ensure compliance and cost-efficiency.
• Build guidelines for version control, Unity Catalog repository configuration, and DevOps pipelines.
• Serve as the Databricks SME across the enterprise, evangelizing best practices and new features.
• Mentor internal teams and conduct workshops or training sessions to upskill stakeholders on Databricks usage.
• Stay current with Databricks and Azure advancements, ensuring the enterprise platform evolves with industry standards.

Experience & Qualifications
• 7+ years of relevant experience in data engineering, cloud data platforms, or similar roles.
• Proven track record of Databricks administration and architecture, ideally with formal Medallion architecture experience.
• Experience managing large-scale Databricks environments across multiple teams and projects.
• Deep expertise in Azure Databricks, Unity Catalog, and Delta Lake.
• Strong knowledge of cluster configuration, compute policies, and workspace design.
• Familiarity with the Azure ecosystem (Azure Data Lake, Key Vault, Azure Active Directory).
• Understanding of data governance, security policies, and compliance standards (handling of PII/PHI).
• Experience with CI/CD pipelines, version control (Git), and infrastructure-as-code practices is desirable.
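The cluster management policies this role owns are expressed in Databricks as JSON policy definitions that constrain what users can configure when creating clusters. The fragment below is an illustrative sketch, not a production policy: the specific runtime versions, limits, and tag name are assumptions chosen for the example.

```json
{
  "spark_version": {
    "type": "allowlist",
    "values": ["13.3.x-scala2.12", "14.3.x-scala2.12"]
  },
  "autotermination_minutes": {
    "type": "range",
    "minValue": 10,
    "maxValue": 60,
    "defaultValue": 30
  },
  "num_workers": {
    "type": "range",
    "maxValue": 8
  },
  "custom_tags.team": {
    "type": "unlimited",
    "isOptional": false
  }
}
```

A policy like this pins allowed runtimes, forces auto-termination (a common cost control), caps cluster size, and requires a cost-attribution tag; separate policies per environment (dev/test/prod) implement the segregation described above.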
Posted 1 month ago
5.0 - 6.0 years
20 - 25 Lacs
Chennai
Work from Office
Mandatory Requirements
• A minimum of 5 years of hands-on Snowflake experience
• Overall experience of at least 6 years
• Proven expertise in query and performance optimization
• Strong background in medallion architecture and star schema design
• Demonstrated experience building scalable data warehouses (not limited to ingesting data from flat files)

Good to Have
• SnowPro Core Certification
• SnowPro Advanced certifications in Data Engineering, Data Analysis, or Architecture are highly desirable
Posted 1 month ago
8 - 13 years
25 - 30 Lacs
Chennai, Bangalore Rural, Hyderabad
Work from Office
Company Name: One of the leading General Insurance companies in India (Chennai)
Industry: General Insurance
Years of Experience: 7+ years
Location: Chennai
Contact: manjeet.kaur@mounttalent.com

Purpose
The candidate is responsible for designing, creating, deploying, and maintaining the organization's data architecture; for ensuring that the organization's data assets are managed effectively and efficiently and used to support its goals and objectives; and for ensuring that the organization's data is secure, with appropriate data governance policies and procedures in place to protect its data assets.

Key Responsibilities
Responsibilities will include but will not be restricted to:
• Design and implement a data architecture that supports the organization's business goals and objectives.
• Develop data models, define data standards and guidelines, and establish processes for data integration, migration, and management.
• Create and maintain data dictionaries: comprehensive sets of data definitions and metadata that provide context and understanding of the organization's data assets.
• Ensure that data is accurate, consistent, and reliable across the organization, including establishing data quality metrics and monitoring data quality on an ongoing basis.
• Ensure that the organization's data is secure and that appropriate data governance policies and procedures are in place to protect its data assets.
• Work closely with other IT professionals, including database administrators, data analysts, and developers, to ensure the data architecture is integrated and aligned with other IT systems and applications.
• Stay up to date with new technologies and trends in data management and architecture, and evaluate their potential impact on the organization's data architecture.
• Communicate with stakeholders across the organization to understand their data needs and ensure the data architecture is aligned with the organization's strategic goals and objectives.

Technical Requirements
• Bachelor's or Master's degree in Computer Science or a related field; certifications in Database Management preferred.
• Expertise in data modeling and design, including conceptual, logical, and physical data models, with the ability to translate business requirements into data models.
• Proficiency in a variety of data management technologies, including relational databases, NoSQL databases, data warehouses, and data lakes.
• Expertise in ETL processes (data extraction, transformation, and loading), with the ability to design and implement data integration processes.
• Experience with data analysis and reporting tools and techniques, with the ability to design and implement data analysis and reporting processes.
• Familiarity with industry-standard data architecture frameworks, such as TOGAF and Zachman, and the ability to apply them to the organization's data architecture.
• Familiarity with cloud computing technologies, including public and private clouds, and the ability to design and implement data architectures that leverage them.

Qualitative Requirements
• Able to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
• Strong analytical and problem-solving skills.
• Able to inspire and motivate a team to achieve organizational goals.

Good to have (not necessary): Databricks, Snowflake, Redshift, Data Mesh, Medallion, Lambda.
Posted 4 months ago