6.0 - 10.0 years
0 - 0 Lacs
hyderabad, telangana
On-site
You will be joining QTek Digital, a leading data solutions provider known for its expertise in custom data management, data warehouse, and data science solutions. Our team of dedicated data professionals, including data scientists, data analysts, and data engineers, collaborates to address present-day challenges and pave the way for future innovations. At QTek Digital, we value our employees and focus on fostering engagement, empowerment, and continuous growth opportunities.

As a BI ETL Engineer at QTek Digital, you will take on a full-time remote position. Your primary responsibilities will revolve around data modeling, applying analytical skills, implementing data warehouse solutions, and managing Extract, Transform, Load (ETL) processes. This role demands strong problem-solving capabilities and the capacity to work autonomously.

To excel in this role, you should ideally possess:
- 6-9 years of hands-on experience in ETL and ELT pipeline development using tools such as Pentaho, SSIS, Fivetran, Airbyte, or similar platforms.
- 6-8 years of practical experience in SQL and other data manipulation languages.
- Demonstrated expertise in data modeling, dashboard design, analytics, data warehousing, and ETL procedures.
- Sound knowledge of data warehousing principles, particularly Kimball design.
- Bonus points for familiarity with Pentaho and Airbyte administration.
- Strong troubleshooting and problem-solving skills.
- Effective communication and collaboration abilities.
- The capability to operate both independently and as part of a team.
- A Bachelor's degree in Computer Science, Information Systems, or a related field.

This position is based in our Hyderabad office, offering an attractive compensation package ranging from INR 5-19 Lakhs, depending on factors such as your skills and prior experience.
Join us at QTek Digital and be part of a dynamic team dedicated to shaping the future of data solutions.
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
Job Description: As a Snowflake Admin with 6+ years of experience and the ability to join immediately, you will be responsible for administering and managing Snowflake environments, including configuration, security, and maintenance tasks. Your role will involve monitoring and optimizing Snowflake performance, storage usage, and query efficiency to enhance overall system functionality.

In this position, you will implement and manage role-based access control (RBAC) and data security policies to safeguard sensitive information. Additionally, you will set up and oversee data sharing, data replication, and virtual warehouses to support various data operations effectively. You will be expected to automate administrative tasks using SQL, the Snowflake CLI, or scripting languages such as Python and Bash; proficiency in these tools will be essential for streamlining processes and improving efficiency within the Snowflake environment. You will also provide support for data integration tools and pipelines such as Fivetran, dbt, Informatica, and Airflow.

Key Skills: Snowflake Admin
Industry Type: IT/Computers - Software
Functional Area: Not specified
Required Education: Bachelor
Employment Type: Full Time, Permanent

If you are looking for a dynamic opportunity to utilize your expertise in Snowflake administration, apply now with Job Code: GO/JC/668/2025. Join our team and work alongside our recruiter, Christopher, in this contract hiring role.
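The automation described above often starts with generating privilege grants from a declarative mapping. A minimal sketch in Python, assuming hypothetical role and object names (this builds the SQL text only; a real script would execute it via the Snowflake connector):

```python
# Hedged sketch: generate Snowflake RBAC GRANT statements from a simple
# role-to-privilege mapping. Role, database, and schema names below are
# illustrative assumptions, not part of the posting.

def build_grants(role: str, privileges: dict) -> list:
    """Return one GRANT statement per (object, privilege) pair."""
    statements = []
    for obj, privs in privileges.items():
        for priv in privs:
            statements.append(f"GRANT {priv} ON {obj} TO ROLE {role};")
    return statements

grants = build_grants(
    "ANALYST_ROLE",
    {
        "DATABASE SALES_DB": ["USAGE"],
        "SCHEMA SALES_DB.PUBLIC": ["USAGE", "CREATE VIEW"],
    },
)
for g in grants:
    print(g)
```

Keeping the mapping in code (or version-controlled config) makes access reviews reproducible, which is the usual motivation for scripting RBAC rather than issuing grants ad hoc.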
Posted 1 day ago
10.0 - 14.0 years
0 Lacs
delhi
On-site
As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. Collaborating with partners to develop Snowflake solutions in customer engagements, you will work with them to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. Additionally, you will assist Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake. Your responsibilities will include keeping partners up to date on key Snowflake product updates and future roadmaps to help them represent Snowflake to their clients about the latest technology solutions and benefits. Running technical enablement programs to provide best practices and solution design workshops to help partners create effective solutions will also be part of your role. Success in this position will require you to drive strategic engagements by quickly grasping new concepts and articulating their business value. You will showcase the impact of Snowflake through compelling customer success stories and case studies, demonstrating a strong understanding of how partners make revenue through the industry priorities and complexities they face. Preferred skill sets and experiences for this role include having a total of 10+ years of relevant experience, experience working with Tech Partners, ISVs, and System Integrators (SIs) in India, and developing data domain thought leadership within the partner community. You should also have presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms, as well as experience with partner integration ecosystems like Alation, FiveTran, Informatica, dbtCloud, etc. 
Hands-on experience with Docker and containerizing Python-based applications, knowledge of container networking and Kubernetes, and proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps, are desirable skills. Experience in the AI/ML domain is a plus.

Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
navi mumbai, maharashtra
On-site
You will be part of a data analytics services company that specializes in creating and managing scalable data platforms for a diverse client base. Leveraging cutting-edge technologies, you will provide actionable insights and value through modern data stack solutions.

Your responsibilities will include:
- Designing, building, and managing customer data platforms independently using Snowflake, dbt, Fivetran, and SQL.
- Collaborating with clients and internal teams to gather business requirements and translate them into reliable data solutions.
- Developing and maintaining ELT pipelines with Fivetran and dbt to automate data ingestion, transformation, and delivery.
- Optimizing SQL code and data models for scalability, performance, and cost efficiency in Snowflake.
- Ensuring data platform reliability, monitoring, and data quality maintenance.
- Providing technical mentorship and guidance to junior engineers, and maintaining comprehensive documentation of engineering processes and architecture.

Required skills and qualifications:
- Proven hands-on experience with Snowflake, dbt, Fivetran, and SQL.
- A strong understanding of data warehousing concepts, ETL/ELT best practices, and modern data stack architectures.
- Experience working independently and owning project deliverables end-to-end.
- Familiarity with version control systems such as Git and with workflow automation tools, along with solid communication and documentation skills.
- The ability to interact directly with clients and understand their business requirements.

Preferred skills:
- Exposure to cloud platforms such as AWS, GCP, and Azure.
- Knowledge of Python or other scripting languages for data pipelines.
- Experience with BI/analytics tools such as Tableau, Power BI, and Looker.
In return, you will have the opportunity to lead the implementation of state-of-the-art data platforms for global clients in a dynamic, growth-oriented work environment with flexible working arrangements and a competitive compensation package. If you are interested in this opportunity, please submit your resume and a short cover letter detailing your experience with Snowflake, dbt, Fivetran, and SQL.
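The ELT work described above typically relies on incremental loads: each run picks up only rows newer than the last processed watermark, as a dbt incremental model does. A minimal sketch of the pattern in plain Python, with illustrative field names:

```python
# Hedged sketch of the incremental-load pattern used in ELT pipelines:
# only rows whose updated_at is past the stored watermark are reprocessed.
# The "updated_at" column name and the sample data are assumptions.

def incremental_rows(rows: list, watermark: str) -> list:
    """Return rows strictly newer than the watermark (ISO date strings)."""
    return [r for r in rows if r["updated_at"] > watermark]

source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-03-15"},
    {"id": 3, "updated_at": "2024-06-30"},
]
# Only rows 2 and 3 are newer than the watermark and get reloaded.
new_rows = incremental_rows(source, watermark="2024-01-01")
print([r["id"] for r in new_rows])
```

Compared with full reloads, this is what keeps Snowflake compute costs proportional to new data rather than to table size, which is the cost-efficiency concern the posting raises.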
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
indore, madhya pradesh
On-site
You are a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction, sought to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance. Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role. Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs. Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. 
Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities.

Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills such as strong analytical thinking, problem-solving abilities, and excellent communication are essential for this role.

Location: Bhilai, Indore
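The staged-file ingestion mentioned above is usually scripted rather than typed by hand. A hedged sketch that assembles a Snowflake COPY INTO command as text, with a hypothetical table, stage, and file format (a real pipeline would pass the string to the connector and add error handling):

```python
# Hedged sketch: build a Snowflake COPY INTO statement for loading staged
# files. Table, stage, and file-format names are made-up examples; the
# statement shape (FROM @stage, FILE_FORMAT, PATTERN) follows Snowflake's
# documented COPY INTO syntax.

def copy_into(table: str, stage: str, file_format: str, pattern: str = "") -> str:
    sql = (
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (FORMAT_NAME = '{file_format}')"
    )
    if pattern:
        # Restrict the load to files matching a regex, e.g. only CSVs.
        sql += f" PATTERN = '{pattern}'"
    return sql + ";"

stmt = copy_into("RAW.ORDERS", "ingest_stage/orders", "CSV_FMT", pattern=".*[.]csv")
print(stmt)
```

Generating the statement from parameters is what lets one job handle many sources, and it pairs naturally with Snowpipe for continuous loads of the same stages.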
Posted 2 days ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Work Location: Hyderabad

What Gramener offers you: Gramener will offer you an inviting workplace, talented colleagues from diverse backgrounds, career paths, and steady growth prospects with great scope to innovate. We aim to create an ecosystem of easily configurable data applications focused on storytelling for public and private use.

Data Architect
We are seeking an experienced Data Architect to design and govern scalable, secure, and efficient data platforms in a data mesh environment. You will lead data architecture initiatives across multiple domains, enabling self-serve data products built on Databricks and AWS, and support both operational and analytical use cases.

Key Responsibilities:
- Design and implement enterprise-grade data architectures leveraging the medallion architecture (Bronze, Silver, Gold).
- Develop and enforce data modelling standards, including flattened data models optimized for analytics.
- Define and implement MDM strategies (Reltio), data governance frameworks (Collibra), and data classification policies.
- Lead the development of data landscapes, capturing sources, flows, transformations, and consumption layers.
- Collaborate with domain teams to ensure consistency across decentralized data products in a data mesh architecture.
- Guide best practices for ingesting and transforming data using Fivetran, PySpark, SQL, and Delta Live Tables (DLT).
- Define metadata and data quality standards across domains.
- Provide architectural oversight for data platform development on Databricks (Lakehouse) and the AWS ecosystem.

Key Skills & Qualifications
Must-have technical skills (Reltio, Collibra, Ataccama, Immuta):
- Experience in the Pharma domain.
- Data modeling (dimensional, flattened, common data model, canonical, and domain-specific, with entity-level data understanding from a business-process point of view).
- Master Data Management (MDM) principles and tools (Reltio).
- Data Governance and Data Classification frameworks.
- Strong experience with Fivetran, PySpark, SQL, and Python.
- Deep understanding of Databricks (Delta Lake, Unity Catalog, Workflows, DLT).
- Experience with AWS services related to data (e.g., S3, Glue, Redshift, IAM).
- Experience with Snowflake.

Architecture & Design:
- Proven expertise in Data Mesh or domain-oriented data architecture.
- Experience with medallion/lakehouse architecture.
- Ability to create data blueprints and landscape maps across complex enterprise systems.

Soft Skills:
- Strong stakeholder management across business and technology teams.
- Ability to translate business requirements into scalable data designs.
- Excellent communication and documentation skills.

Preferred Qualifications:
- Familiarity with regulatory and compliance frameworks (e.g., GxP, HIPAA, GDPR).
- Background in data product building.

About Us
We consult and deliver solutions to organizations where data is the core of decision-making. We undertake strategic data consulting for organizations, laying out the roadmap for data-driven decision-making. This helps organizations convert data into a strategic differentiator. Through a host of our products, solutions, and service offerings, we analyze and visualize large amounts of data. To know more about us, visit the Gramener Website and Gramener Blog.
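The medallion architecture named in the responsibilities promotes raw Bronze records into a cleaned, de-duplicated Silver layer. On the platform described this would run in PySpark or DLT; the sketch below uses plain Python purely to illustrate the promotion step, with made-up record fields:

```python
# Hedged sketch of a Bronze -> Silver medallion promotion: drop malformed
# records and keep only the latest version of each entity. Field names
# ("id", "ingested_at", "amount") are illustrative assumptions.

def bronze_to_silver(bronze: list) -> list:
    """Clean and de-duplicate raw records, keeping the newest per id."""
    latest = {}
    for rec in bronze:
        if rec.get("id") is None:
            continue  # a real pipeline would quarantine, not silently drop
        prev = latest.get(rec["id"])
        if prev is None or rec["ingested_at"] > prev["ingested_at"]:
            latest[rec["id"]] = rec
    return sorted(latest.values(), key=lambda r: r["id"])

bronze = [
    {"id": 1, "ingested_at": "2024-05-01", "amount": 10},
    {"id": None, "ingested_at": "2024-05-01", "amount": 99},  # malformed
    {"id": 1, "ingested_at": "2024-05-02", "amount": 12},     # newer version
    {"id": 2, "ingested_at": "2024-05-01", "amount": 7},
]
silver = bronze_to_silver(bronze)
```

Keeping Bronze immutable and applying rules like these on the way to Silver is what makes the layers auditable: any Silver row can be traced back to the raw records that produced it.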
Posted 2 days ago
5.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
curatAId is seeking a Senior Snowflake Consultant on behalf of our client, a fast-growing organization focused on data-driven innovation. This role combines Snowflake expertise with DevOps, DBT, and Airflow to support the development and operation of a modern, cloud-based enterprise data platform. The ideal candidate will be responsible for building and managing data infrastructure, developing scalable data pipelines, implementing data quality and governance frameworks, and automating workflows for operational efficiency.

To apply for this position, it is mandatory to register on our platform at www.curataid.com and complete a 10-minute technical quiz on the Snowflake skill.

Title: Senior Data Engineer
Level: Consultant/Deputy Manager/Manager/Senior Manager
Relevant Experience: Minimum of 5+ years of hands-on experience with Snowflake, plus DevOps, DBT, and Airflow
Must-Have Skills: Data Engineering, Snowflake, DBT, Airflow & DevOps
Location: Mumbai, Gurgaon, Bengaluru, Chennai, Kolkata, Bhubaneshwar, Coimbatore, Ahmedabad

Qualifications:
- 5+ years of relevant Snowflake experience in a data engineering context. (Must have)
- 4+ years of relevant experience in DBT, Airflow & DevOps. (Must have)
- Strong hands-on experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with cloud data warehouses such as Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse.
- Experience with version control systems (GitHub, BitBucket, GitLab).
- Strong SQL expertise.
- Ability to implement best practices for data storage management, security, and retrieval efficiency.
- Experience with pipeline orchestration tools (Fivetran, Stitch, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Java, Scala, etc.).
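An orchestrator such as Airflow, named above, resolves task dependencies into an execution order before running anything. The tiny topological sort below illustrates that idea; the task names are invented and this is not the Airflow API itself:

```python
# Hedged sketch: resolve a DAG of pipeline tasks into an execution order
# where every dependency runs before its dependents, as an orchestrator
# like Airflow does. Task names are hypothetical; no cycle detection here.

def execution_order(deps: dict) -> list:
    """Depth-first topological sort of {task: set(of prerequisite tasks)}."""
    order, done = [], set()

    def visit(task):
        for dep in sorted(deps.get(task, ())):
            if dep not in done:
                visit(dep)
        if task not in done:
            done.add(task)
            order.append(task)

    for task in sorted(deps):
        visit(task)
    return order

dag = {
    "extract": set(),
    "transform": {"extract"},
    "dbt_test": {"transform"},
    "publish": {"dbt_test"},
}
print(execution_order(dag))
```

In Airflow proper the same dependency information is declared with operators and `>>` chaining, and the scheduler performs this resolution for you.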
Posted 2 days ago
6.0 - 10.0 years
7 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Data Engineering, AirFlow, Fivetran, CI/CD using GitHub

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs.
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments.
- Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions within an SLA-driven environment.
- Have a strong background in developing data products and APIs and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Are familiar with columnar OLAP databases and data modeling.
- Have experience building ELT/ETL processes using tools like dbt, AirFlow, and Fivetran, with CI/CD using GitHub and reporting in Tableau.
- Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have:
- A good understanding of Salesforce & Netsuite systems
- Experience in SAAS environments
- Designed and deployed ML models
- Experience with events and streaming data

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
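Ensuring data quality across sources, mentioned in the responsibilities, usually takes the form of checks that gate a pipeline before data is promoted. A hedged sketch of such a gate; the thresholds, column names, and sample rows are illustrative assumptions:

```python
# Hedged sketch of a pre-promotion data-quality gate: a row-count floor
# plus null checks on required columns. In an SLA-driven environment a
# failed report would block the load and page the on-call engineer.

def quality_report(rows: list, required: list, min_rows: int = 1) -> dict:
    """Return {'passed': bool, 'issues': [str, ...]} for a batch of rows."""
    issues = []
    if len(rows) < min_rows:
        issues.append(f"expected at least {min_rows} rows, got {len(rows)}")
    for col in required:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            issues.append(f"{nulls} null(s) in required column '{col}'")
    return {"passed": not issues, "issues": issues}

report = quality_report(
    [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": None}],
    required=["id", "email"],
)
print(report)
```

Tools like dbt ship equivalent declarative tests (`not_null`, `unique`); the value of writing one by hand is seeing that they reduce to simple assertions over a batch.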
Posted 3 days ago
15.0 - 19.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior AI Architect at Dailoqa, you will play a pivotal role in shaping, designing, and delivering agentic AI solutions that drive real-world business value. You will collaborate with business and technology stakeholders, lead cross-functional teams, and ensure our AI architectures are robust, scalable, and aligned with our vision of combined intelligence and financial inclusion.

Agentic AI Solution Design:
- Collaborate with stakeholders to identify high-impact agentic AI use cases, define success metrics, and determine data requirements tailored to Financial Services clients.
- Architect and oversee the implementation of end-to-end agentic AI solutions aligned with Dailoqa's strategic objectives and client needs.

Leadership & Cross-Functional Collaboration:
- Lead and mentor cross-functional teams in the development and deployment of scalable agentic AI applications and infrastructures.
- Work closely with business stakeholders to translate complex requirements into actionable AI architecture and technical roadmaps.

Technology Evaluation & Governance:
- Evaluate, recommend, and integrate advanced AI/ML platforms, frameworks, and technologies that enable agentic AI capabilities.
- Develop and enforce AI governance frameworks, best practices, and ethical standards, ensuring compliance with industry regulations and responsible AI principles.

Performance Optimization & Continuous Improvement:
- Optimize AI models for performance, scalability, and efficiency, leveraging cloud-native and distributed computing resources.
- Stay ahead of emerging trends in agentic AI, machine learning, and data science, applying new insights to enhance solution quality and business impact.

Technical Leadership & Talent Development:
- Provide technical leadership, mentorship, and code review for junior and peer team members.
- Participate in the hiring, onboarding, and development of AI talent, fostering a culture of innovation and excellence.
- Lead sprint planning and technical assessments, and ensure high standards in code quality and solution delivery.

Required Qualifications:
- 15+ years of total experience, including 8+ years in machine learning and data science, with more recent experience (4-5 years) applying generative AI models to practical, comprehensive technology solutions and AI consultancy.
- Knowledge of basic algorithms, object-oriented and functional design principles, and best-practice patterns.
- Experience in implementing GenAI, NLP, computer vision, or other AI frameworks/technologies.

Tools & Technology:
- LLMs and implementing RAG or different prompt strategies.
- Azure OpenAI and off-the-shelf platform-native AI tools and models.
- Knowledge of ML pipeline orchestration tools.
- Experience in Python, ideally with working knowledge of various supporting packages.
- Experience in REST API development, NoSQL database design, and RDBMS design and optimization.
- Strong experience in data engineering and aligned hyperscale platforms, e.g. Databricks, Synapse, Fivetran.

Education and Other Skills:
- Master's or Ph.D. in Computer Science, Data Science, or a related field.
- Extensive experience with modern AI frameworks, cloud platforms, and big data technologies.
- Strong background in designing and implementing AI solutions for enterprise-level applications.
- Proven ability to lead and mentor technical teams.
- Excellent communication skills with the ability to explain complex AI concepts to both technical and non-technical audiences.
- Deep understanding of AI ethics and responsible AI practices.

Working at Dailoqa will provide you with an opportunity to be part of a dynamic and innovative team that values collaboration, innovation, and continuous learning. If you are proactive, adaptable, and passionate about leveraging AI to solve real-world challenges in the financial services industry, then this role might be the perfect fit for you.
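The RAG strategies listed under Tools & Technology hinge on a retrieval step: score stored passages against a query and pass the best matches into the prompt. Production systems use embedding similarity; the sketch below substitutes simple term overlap to show the shape of the step, with a made-up corpus:

```python
# Hedged sketch of RAG retrieval by term overlap (a stand-in for embedding
# similarity): rank candidate passages by shared words with the query and
# keep the top k to ground the LLM prompt. The corpus is invented.

def retrieve(query: str, docs: list, k: int = 1) -> list:
    """Return the k docs sharing the most words with the query."""
    q = set(query.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(q & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

corpus = [
    "Loan eligibility rules for retail customers",
    "Office seating plan",
    "Fraud detection model monitoring runbook",
]
top = retrieve("rules for loan eligibility", corpus)
print(top)
```

Swapping the overlap score for cosine similarity over embeddings turns this into the retrieval stage of a real RAG pipeline; everything downstream (stuffing the matches into the prompt) stays the same.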
Posted 6 days ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You are a Sr. Data Engineer with over 7 years of experience, specializing in Data Engineering, Python, and SQL. You will be a part of the Data Engineering team in the Enterprise Data Insights organization, responsible for building data solutions, designing ETL/ELT processes, and managing the data platform to support various stakeholders across the organization. Your role is crucial in driving technology and data-led solutions to foster growth and innovation at scale. Your responsibilities as a Senior Data Engineer include collaborating with cross-functional stakeholders to prioritize requests, identify areas for improvement, and provide recommendations. You will lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes. Furthermore, you will engage in fostering collaboration with corporate engineering, product teams, and other engineering groups, while also leading and mentoring engineering discussions and advocating for best practices. To excel in this role, you should possess a degree in Computer Science or a related technical field and have a proven track record of over 5 years in Data Engineering. Your expertise should include designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment, and developing data products and APIs. Proficiency in SQL/NoSQL databases, particularly Snowflake, Redshift, or MongoDB, along with strong programming skills in Python, is essential. Additionally, experience with columnar OLAP databases, data modeling, and tools like dbt, AirFlow, Fivetran, GitHub, and Tableau reporting will be beneficial. Good communication and interpersonal skills are crucial for effectively collaborating with business stakeholders and translating requirements into actionable insights. 
An added advantage would be a good understanding of Salesforce & Netsuite systems, experience in SAAS environments, experience designing and deploying ML models, and familiarity with events and streaming data. Join us in driving data-driven solutions and experiences to shape the future of technology and innovation.
Posted 6 days ago
6.0 - 9.0 years
8 - 11 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs.
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments.
- Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions within an SLA-driven environment.
- Have a strong background in developing data products and APIs and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Are familiar with columnar OLAP databases and data modeling.
- Have experience building ELT/ETL processes using tools like dbt, AirFlow, and Fivetran, with CI/CD using GitHub and reporting in Tableau.
- Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have:
- A good understanding of Salesforce & Netsuite systems
- Experience in SAAS environments
- Designed and deployed ML models
- Experience with events and streaming data

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Posted 6 days ago
10.0 - 14.0 years
8 - 15 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Hybrid
Job Title: Senior Software Engineer - Full Stack
Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Timings: 11 AM - 8 PM IST
Posted 6 days ago
6.0 - 11.0 years
7 - 10 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization.
Location - Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization.
Posted 1 week ago
2.0 - 7.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer

Dear Candidates,

Greetings from ExxonMobil! Please copy and paste the link below into your browser to apply for the position on the company website.
Link to apply: https://jobs.exxonmobil.com/job-invite/80614/

What role you will play in our team:
Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability.

What you will do:
- Support the development and ownership of ETL pipelines within cloud data platforms
- Automate data extraction and transformation pipelines using Python, Airflow, Azure Data Factory, Qlik, or Fivetran
- Deliver a task monitoring and notification system for data pipeline status
- Support data cleansing, enrichment, and curation activities to enable ongoing business use cases
- Develop and deliver data pipelines through a CI/CD delivery methodology
- Develop monitoring around pipelines to ensure uptime of data flows
- Optimize and refine current queries against Snowflake
- Work with Snowflake, MSSQL, Postgres, Oracle, Azure SQL, and other relational databases
- Work with cloud databases such as Azure SQL and Azure PostgreSQL
- Work with Change Data Capture ETL software, such as Qlik and Fivetran, to populate Snowflake
- Identify and remediate failed and long-running queries
- Develop large aggregate queries across a multitude of schemas

Skills and Qualifications (also the preferred qualifications/experience for this role):
- Experience with data processing/analytics and ETL data transformation
- Proficient in ingesting data to/from Snowflake and Azure storage accounts
- Proficiency in at least one of the following languages: Python, C#, C++, F#, Java
- Proficiency in SQL and NoSQL databases
- Knowledge of SQL query development and optimization
- Demonstrated experience with Snowflake, Qlik Replicate, Fivetran, and Azure Data Explorer
- Azure cloud experience (current/future) with ADX, ADF, and Databricks
- Expertise with Airflow, Qlik, Fivetran, and Azure Data Factory
- Management of Snowflake through dbt scripting
- Solid understanding of data strategies, including data management, data curation, and data governance
- Ability to quickly build relationships and credibility with business customers and agile teams
- A passion for learning about and experimenting with new technologies
- Confidence in creating and delivering technical presentations and training
- Excellent organization and planning skills

Thanks & Regards,
Anita
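The "task monitoring and notification system" and "identification of failed and long running queries" items in the posting can be illustrated with a small sketch. This is a hypothetical example, not ExxonMobil code: a production pipeline would report to Airflow callbacks or an alerting tool rather than an in-memory list, and the threshold value and task name below are invented.

```python
import time
from functools import wraps

events = []  # toy event sink; a real system would emit metrics or alerts
SLOW_THRESHOLD_SECONDS = 0.5  # illustrative threshold, not from the posting

def monitored(task_name):
    """Wrap a pipeline task to record duration, success/failure, and slowness."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                result = fn(*args, **kwargs)
                status = "success"
                return result
            except Exception:
                status = "failed"
                raise  # re-raise so the orchestrator still sees the failure
            finally:
                elapsed = time.monotonic() - start
                events.append({
                    "task": task_name,
                    "status": status,
                    "slow": elapsed > SLOW_THRESHOLD_SECONDS,
                })
        return wrapper
    return decorator

@monitored("load_orders")
def load_orders():
    return 42  # stand-in for an actual extract/load step

load_orders()
print(events[0]["status"])  # success
```

The same wrapper applied to a query-running function gives the raw material for flagging long-running queries: anything with `slow=True` becomes a notification candidate.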
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
vijayawada, andhra pradesh
On-site
As a Lead Data Engineer based in Vijayawada, Andhra Pradesh, you will leverage your extensive experience in data engineering and data architecture to design and develop end-to-end data solutions, data pipelines, and ETL processes. With a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, along with over 10 years of relevant experience, you will play a crucial role in ensuring the success of data projects.

You will demonstrate strong knowledge of data technologies such as Snowflake, Databricks, Apache Spark, Hadoop, dbt, Fivetran, and Azure Data Factory. Your expertise in Python and SQL will be essential in tackling complex data challenges, and your understanding of data governance, data quality, and data security principles will guide you in maintaining high standards of data management.

In this role, your problem-solving and analytical skills will be put to the test as you work both independently and collaboratively in an Agile environment. Your communication and leadership skills will be instrumental in managing projects and teams and engaging in pre-sales activities. You will have the opportunity to demonstrate technical leadership by delivering solutions within defined timeframes and building strong client relationships.

Your experience with complete project life cycle activities, agile methodologies, and globally distributed teams will be a valuable asset, as will your track record of managing complex consulting projects and communicating effectively with technical and non-technical staff. If you are looking for a challenging role that combines technical expertise, leadership skills, and client engagement, this Lead Data Engineer position offers a dynamic opportunity to excel in a fast-paced, collaborative environment.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Senior Auditor, Technology at LegalZoom, you will be an impactful member of the internal audit team, helping the department achieve its mission and objectives. Your role will involve evaluating technology risks in a dynamic environment, assessing the design and effectiveness of internal controls over financial reporting, and ensuring compliance with operational and regulatory requirements. You will document audit procedures and results following departmental standards and execute within agreed timelines. Additionally, you will provide advisory support to stakeholders on internal control considerations, collaborate with external auditors when necessary, and focus on continuous improvement of the audit department. Your commitment to integrity and ethics, coupled with a passion for the internal audit profession and LegalZoom's mission, is essential.

Ideally, you hold a Bachelor's degree in computer science, information systems, or accounting, along with 3+ years of experience in IT internal audit and Sarbanes-Oxley compliance, particularly in the technology sector. Previous experience in a Big 4 accounting firm and internal audit at a public company would be advantageous. A professional certification such as CISA, CIA, CRISC, or CISSP is preferred. Strong communication skills, self-management abilities, and the capacity to work on multiple projects across different locations are crucial for this role. Familiarity with technologies like Oracle Cloud, AWS, Salesforce, Azure, and others is beneficial, along with reliable internet service for remote work.

Join LegalZoom in making a difference and contributing to the future of accessible legal advice for all. LegalZoom is committed to diversity, equality, and inclusion, offering equal employment opportunities to all employees and applicants without discrimination based on any protected characteristic.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, developing, and maintaining dashboards and reports using Sigma Computing. Your main focus will be on collaborating with business stakeholders to understand data requirements and deliver actionable insights. Key responsibilities include:
- Writing and optimizing SQL queries that run directly on cloud data warehouses
- Enabling self-service analytics for business users via Sigma's spreadsheet interface and templates
- Applying row-level security and user-level filters to ensure proper data access controls
- Working closely with data engineering teams to validate data accuracy and ensure model alignment
- Troubleshooting performance or data issues in reports and dashboards
- Training and supporting users on Sigma best practices, tools, and data literacy

To excel in this role, you should have at least 5 years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience with cloud data platforms such as Snowflake, BigQuery, or Redshift are essential, as is familiarity with data modeling concepts and modern data stacks. You should be able to translate business requirements into technical solutions, with knowledge of data governance, security, and role-based access controls, plus excellent communication and stakeholder management skills. Experience with tools like Looker, Tableau, or Power BI is beneficial for comparative insights; familiarity with dbt, Fivetran, or other ELT/ETL tools is a plus, and exposure to Agile or Scrum methodologies would also be advantageous.
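The row-level security / user-level filter responsibility mentioned in this posting can be illustrated with a minimal sketch. In Sigma this is configured on the dataset rather than hand-coded; the example below only shows the underlying idea, using Python's sqlite3 and an invented sales table.

```python
import sqlite3

def rows_for_user(con, user_region):
    """Apply a user-level filter so each user sees only their region's rows.

    Simplified sketch of row-level security: the filter is injected into
    every query on the user's behalf. The sales table and region column
    are invented for illustration.
    """
    return con.execute(
        "SELECT id, region, amount FROM sales WHERE region = ? ORDER BY id",
        (user_region,),
    ).fetchall()

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER, region TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                [(1, "APAC", 100.0), (2, "EMEA", 250.0), (3, "APAC", 75.0)])

print(rows_for_user(con, "APAC"))  # [(1, 'APAC', 100.0), (3, 'APAC', 75.0)]
```

The key property is that the restriction lives in the data-access layer, not in each dashboard, so every report built on the dataset inherits it automatically.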
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing and implementing scalable Snowflake data warehouse architectures, including schema modeling and data partitioning. You will lead or support data migration projects from on-premise or legacy cloud platforms to Snowflake, and develop ETL/ELT pipelines and data integrations using tools such as DBT, Fivetran, Informatica, and Airflow. The role also includes:
- Defining and implementing best practices for data modeling, query optimization, and storage efficiency in Snowflake
- Collaborating with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, to align architectural solutions
- Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake
- Working with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments
- Optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform
- Staying updated with Snowflake features, cloud vendor offerings, and best practices

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- X years of experience in data engineering, data warehousing, or analytics architecture
- 3+ years of hands-on experience in Snowflake architecture, development, and administration
- Strong knowledge of cloud platforms (AWS, Azure, or GCP)
- Solid understanding of SQL, data modeling, and data transformation principles
- Experience with ETL/ELT tools, orchestration frameworks, and data integration
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced)
- Experience in building data lakes, data mesh architectures, or streaming data platforms
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics
- Experience with Agile delivery models and CI/CD workflows
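One of the governance items above, masking policies, can be sketched briefly. Snowflake expresses this declaratively as a SQL masking policy attached to a column; the Python function below only illustrates the idea, and the role names are invented for the example.

```python
def mask_email(value, role):
    """Toy version of a column masking policy: privileged roles see the
    full value, everyone else sees a redacted form.

    Role names (ADMIN, ANALYST_FULL) are invented; in Snowflake this logic
    would live in a CREATE MASKING POLICY statement, not application code.
    """
    if role in {"ADMIN", "ANALYST_FULL"}:
        return value
    local, _, domain = value.partition("@")
    return "***@" + domain  # keep the domain, hide the local part

print(mask_email("user@example.com", "REPORTER"))  # ***@example.com
print(mask_email("user@example.com", "ADMIN"))     # user@example.com
```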
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms, and developing ETL/ELT pipelines and data integrations using tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake.

Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities, as will working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments. You will optimize resource utilization, monitor workloads, manage the cost-effectiveness of the platform, and stay updated with Snowflake features, cloud vendor offerings, and best practices to drive continuous improvement in data architecture.

Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
- 5+ years of experience in data engineering, data warehousing, or analytics architecture
- 3+ years of hands-on experience in Snowflake architecture, development, and administration
- Strong knowledge of cloud platforms such as AWS, Azure, or GCP
- Solid understanding of SQL, data modeling, and data transformation principles
- Experience with ETL/ELT tools, orchestration frameworks, and data integration
- Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance

Additional Qualifications:
- Snowflake certification (SnowPro Core / Advanced)
- Experience in building data lakes, data mesh architectures, or streaming data platforms
- Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics
- Experience with Agile delivery models and CI/CD workflows

This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
punjab
On-site
About Us
We are a global climate technologies company engineered for sustainability. We create sustainable and efficient residential, commercial and industrial spaces through HVACR technologies. We protect temperature-sensitive goods throughout the cold chain. And we bring comfort to people globally. Best-in-class engineering, design and manufacturing combined with category-leading brands in compression, controls, software and monitoring solutions result in next-generation climate technology that is built for the needs of the world ahead. Whether you are a professional looking for a career change, an undergraduate student exploring your first opportunity, or a recent graduate with an advanced degree, we have opportunities that will allow you to innovate, be challenged and make an impact. Join our team and start your journey today!

JOB DESCRIPTION / RESPONSIBILITIES:
We are looking for Data Engineers. Candidates must have a minimum of 10 years of experience in a Data Engineer role, including the following tools/technologies:
- Experience with relational (SQL) databases
- Experience with data warehouses like Oracle, SQL & Snowflake
- Technical expertise in data modeling, data mining and segmentation techniques
- Experience with building new and troubleshooting existing data pipelines using tools like Pentaho Data Integration (PDI), ADF (Azure Data Factory), Snowpipe, Fivetran, DBT
- Experience with batch and real-time data ingestion and processing frameworks
- Experience with languages like Python, Java, etc.
- Knowledge of additional cloud-based analytics solutions, along with Kafka, Spark and Scala, is a plus

Responsibilities:
- Develops code and solutions that transfer/transform data across various systems
- Maintains deep technical knowledge of various tools in the data warehouse, data hub, and analytical tools
- Ensures data is transformed and stored in efficient methods for retrieval and use
- Maintains data systems to ensure optimal performance
- Develops a deep understanding of underlying business systems involved with analytical systems
- Follows standard software development lifecycle, code control, code standards and process standards
- Maintains and develops technical knowledge by self-training on current toolsets and computing environments, participates in educational opportunities, maintains professional networks, and participates in professional organizations related to their tech skills

Systems Analysis
- Works with key stakeholders to understand business needs and capture functional and technical requirements
- Offers ideas that simplify the design and complexity of solutions delivered
- Effectively communicates any expectations required of stakeholders or other resources during solution delivery
- Develops and executes test plans to ensure successful rollout of solution, including accuracy and quality of data

Service Management
- Effectively communicates to leaders and stakeholders any obstacles that occur during solution delivery
- Defines and manages promised delivery dates
- Proactively researches, analyzes, and predicts operational issues, informing leadership where appropriate
- Offers viable options to solve unexpected/unknown issues that occur during solution development and delivery

EDUCATION / JOB-RELATED TECHNICAL SKILLS:
- Bachelor's Degree in Computer Science/Information Technology or equivalent
- Ability to effectively communicate with others at all levels of the Company, both verbally and in writing; demonstrates a courteous, tactful, and professional approach with employees and others
- Ability to work in a large, global corporate structure

Our Commitment to Our People
Across the globe, we are united by a singular Purpose: Sustainability is no small ambition. That's why everything we do is geared toward a sustainable future, for our generation and all those to come. Through groundbreaking innovations, HVACR technology and cold chain solutions, we are reducing carbon emissions and improving energy efficiency in spaces of all sizes, from residential to commercial to industrial.

Our employees are our greatest strength. We believe that our culture of passion, openness, and collaboration empowers us to work toward the same goal: to make the world a better place. We invest in the end-to-end development of our people, beginning at onboarding and through senior leadership, so they can thrive personally and professionally. Flexible and competitive benefits plans offer the right options to meet your individual/family needs. We provide employees with flexible time-off plans, including paid parental leave (maternal and paternal), vacation and holiday leave. Together, we have the opportunity and the power to continue to revolutionize the technology behind air conditioning, heating and refrigeration, and cultivate a better future. Learn more about us and how you can join our team!

Our Commitment to Diversity, Equity & Inclusion
At Copeland, we believe having a diverse, equitable and inclusive environment is critical to our success. We are committed to creating a culture where every employee feels welcomed, heard, respected, and valued for their experiences, ideas, perspectives and expertise. Ultimately, our diverse and inclusive culture is the key to driving industry-leading innovation, better serving our customers and making a positive impact in the communities where we live.

Equal Opportunity Employer
Posted 3 weeks ago
6.0 - 11.0 years
7 - 17 Lacs
Gurugram
Work from Office
We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP. As a Data Engineer, you will develop end-to-end ETL/ELT pipelines.
Posted 3 weeks ago
10.0 - 20.0 years
25 - 30 Lacs
Bengaluru
Remote
Role & responsibilities:
- Data Platform: Snowflake, dbt, Fivetran, Oracle OCI
- Visualization: Tableau
- Cloud & Identity: Azure, Microsoft Entra (Entra ID / Azure AD)
- Infrastructure as Code: OpenTofu (Terraform alternative); migration from Terraform
- Scripting & Monitoring: SQL, Python/Bash, monitoring tools
Posted 3 weeks ago
5.0 - 9.0 years
15 - 19 Lacs
Chennai
Work from Office
Senior Data Engineer - Azure
Years of Experience: 5
Job location: Chennai

Job Description:
We are looking for a skilled and experienced Senior Azure Developer to join the team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description:
This data engineering role involves creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows/pipelines, and constructing data storages (NoSQL, SQL), tools to work with big data (Hadoop, Kafka), and integration tools to connect sources or other databases.

Role Responsibility:
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into related coding
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate with all project stakeholders on the project status
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules, wherever required
- Be proficient in written, verbal, and presentation communication (English)
- Coordinate with the UAT team

Role Requirement:
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell/PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering, documentation processes, and performing unit testing
- Understanding and implementing QA and various testing processes in the project
- Knowledge of any BI tools will be an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirement:
- Demonstrated expertise as a Data Engineer, specializing in Azure cloud services
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
- Create and execute efficient, scalable, and dependable data pipelines utilizing Azure Data Factory
- Utilize Azure Databricks for data transformation and processing
- Effectively oversee and enhance data storage solutions, emphasizing Azure Data Lake and other Azure storage services
- Construct and uphold workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
- Proficient in programming languages like Python and SQL, and conversant with pertinent scripting languages
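The "slowly changing dimensions" concept listed under Role Requirement can be illustrated with a minimal Type 2 sketch: when a tracked attribute changes, close out the current row and insert a new current version, preserving history. This is illustrative only (warehouse tools such as dbt snapshots do this declaratively); the table and column names are invented, with sqlite3 standing in for the warehouse.

```python
import sqlite3

def scd2_upsert(con, key, new_value, as_of):
    """Type 2 slowly-changing-dimension update: retire the current row
    and insert a new current row, keeping the full change history."""
    cur = con.execute(
        "SELECT value FROM dim_customer WHERE key = ? AND is_current = 1", (key,)
    ).fetchone()
    if cur and cur[0] == new_value:
        return  # no change; keep the existing current row
    con.execute(
        "UPDATE dim_customer SET is_current = 0, valid_to = ? "
        "WHERE key = ? AND is_current = 1",
        (as_of, key),
    )
    con.execute(
        "INSERT INTO dim_customer (key, value, valid_from, valid_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (key, new_value, as_of),
    )

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer
               (key TEXT, value TEXT, valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
scd2_upsert(con, "c1", "Chennai", "2024-01-01")
scd2_upsert(con, "c1", "Pune", "2024-06-01")  # attribute change -> new version
history = con.execute(
    "SELECT value, is_current FROM dim_customer WHERE key='c1' ORDER BY valid_from"
).fetchall()
print(history)  # [('Chennai', 0), ('Pune', 1)]
```

Contrast this with Type 1, which would simply overwrite the value and lose the history.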
Posted 4 weeks ago