7.0 - 12.0 years
7 - 11 Lacs
Delhi, India
On-site
Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams for scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Utilize dbt and version control (Git) for data transformation and management
Posted 2 months ago
7.0 - 12.0 years
7 - 11 Lacs
Pune, Maharashtra, India
On-site
Key deliverables:
- Enhance and maintain the MDM platform to support business needs
- Develop data pipelines using Snowflake, Python, SQL, and orchestration tools like Airflow
- Monitor and improve system performance and troubleshoot data pipeline issues
- Resolve production issues and ensure platform reliability

Role responsibilities:
- Collaborate with data engineering and analytics teams for scalable solutions
- Apply DevOps practices to streamline deployment and automation
- Integrate cloud-native tools and services (AWS, Azure) with the data platform
- Utilize dbt and version control (Git) for data transformation and management
Posted 2 months ago
9.0 - 13.0 years
25 - 35 Lacs
Hyderabad
Hybrid
Senior Data Engineer
- You are familiar with AWS and Azure Cloud.
- You have extensive knowledge of Snowflake; SnowPro Core certification is a must-have.
- You have used DBT in at least one project to deploy models in production.
- You have configured and deployed Airflow and integrated various operators in Airflow (especially DBT & Snowflake).
- You can design and build release pipelines and understand the Azure DevOps ecosystem.
- You have an excellent understanding of Python (especially PySpark) and are able to write metadata-driven programs.
- You are familiar with Data Vault (Raw, Business) as well as concepts like Point in Time and Semantic Layer.
- You are resilient in ambiguous situations and can clearly articulate the problem in a business-friendly way.
- You believe in documenting processes, managing the artifacts, and evolving them over time.
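For illustration only (not part of this posting): a minimal sketch of how an Airflow DAG might chain a Snowflake load with a dbt run, as described above. It assumes the apache-airflow-providers-snowflake package, a configured `snowflake_default` connection, Airflow 2.4+, and a hypothetical dbt project path.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="raw_to_marts",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Load staged files into a raw Snowflake table (assumed stage and table names).
    load_raw = SnowflakeOperator(
        task_id="copy_into_raw_orders",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw.orders FROM @raw_stage/orders/;",
    )

    # Run dbt transformations from a hypothetical project directory.
    run_dbt = BashOperator(
        task_id="dbt_run_marts",
        bash_command="cd /opt/dbt_project && dbt run --select marts",
    )

    load_raw >> run_dbt
```

In practice the dbt step could also be driven by a dedicated provider or Cosmos; the BashOperator is used here only to keep the sketch self-contained.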
Posted 2 months ago
8.0 - 12.0 years
35 - 50 Lacs
Chennai
Remote
Experience with modern data warehouses (Snowflake, BigQuery, Redshift) and graph databases. Experience designing and building efficient data pipelines for the ingestion and transformation of data into a data warehouse. Proficiency in Python, dbt, Git, SQL, AWS, and Snowflake.
Posted 2 months ago
5.0 - 10.0 years
15 - 22 Lacs
Hyderabad
Work from Office
- Data engineering, data management, Snowflake, SQL, data modeling, and cloud-native data architecture
- AWS, Azure, or Google Cloud (Snowflake on cloud platforms)
- ETL tools such as Informatica, Talend, or dbt
- Python or shell scripting
Posted 2 months ago
7.0 - 11.0 years
20 - 35 Lacs
Gandhinagar, Ahmedabad
Hybrid
Job Title: Senior Data Engineer
Experience: 8 to 10 Years
Location: Ahmedabad & Gandhinagar
Employment Type: Full-time

Our client is a leading provider of advanced solutions for capital markets, specializing in cutting-edge trading infrastructure and software. With a global presence and a strong focus on innovation, the company empowers professional traders, brokers, and financial institutions to execute high-speed, high-performance trading strategies across multiple asset classes. Their technology is known for its reliability, low latency, and scalability, making it a preferred choice for firms seeking a competitive edge in dynamic financial environments.

Role & responsibilities
- Design, develop, and maintain scalable and reliable data pipelines using DBT and Airflow.
- Work extensively with Snowflake to optimize data storage, transformation, and access.
- Develop and maintain efficient ETL/ELT processes in Python to support analytical and operational workloads.
- Ensure high standards of data quality, consistency, and security across systems.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Monitor and troubleshoot data pipelines, resolving issues proactively.
- Optimize the performance of existing data workflows and recommend improvements.
- Document data engineering processes and solutions effectively.

Preferred candidate profile
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 8-10 years of experience in data engineering or related roles
- Strong knowledge of SQL and data warehousing principles
- Familiarity with version control (e.g., Git) and CI/CD practices
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration abilities

Preferred Skills
- Experience with cloud platforms like AWS, GCP, or Azure
- Exposure to data governance and security best practices
- Knowledge of modern data architecture and real-time processing frameworks

Competitive Benefits Offered By Our Client:
- Relocation Support: An additional relocation allowance to assist with moving expenses.
- Comprehensive Health Benefits: Including medical, dental, and vision coverage.
- Flexible Work Schedule: Hybrid work model with an expectation of just 2 days on-site per week.
- Generous Paid Time Off (PTO): 21 days per year, with the ability to roll over 1 day into the following year. Additionally, 1 day per year is allocated for volunteering, 2 training days per year for uninterrupted professional development, and 1 extra PTO day during milestone years.
- Paid Holidays & Early Dismissals: A robust paid holiday schedule with early dismissal on select days, plus generous parental leave for all genders, including adoptive parents.
- Tech Resources: A rent-to-own program offering employees a company-provided Mac/PC laptop and/or mobile phone of their choice, along with a tech accessories budget for monitors, headphones, keyboards, and other office equipment.
- Health & Wellness Subsidies: Contributions toward gym memberships and health/wellness initiatives to support your well-being.
- Milestone Anniversary Bonuses: Special bonuses to celebrate key career milestones.
- Inclusive & Collaborative Culture: A forward-thinking, culture-based organisation that values diversity and inclusion and fosters collaborative teams.
Posted 2 months ago
4.0 - 7.0 years
20 - 35 Lacs
Noida
Remote
Job posting Title: Senior Data Engineer with Looker experience
Relevant Experience: 5 to 10 years
Key Skills: Looker dashboards, LookML, DBT, GCP and BigQuery
Location: Remote/Work from Home
Note: We need to fill this position on priority, hence candidates with a shorter notice period/immediate joining will be given preference.

Embark on an exciting journey into the realm of software development with 3Pillar! We extend an invitation for you to join our team and gear up for a thrilling adventure. At 3Pillar, our focus is on leveraging cutting-edge technologies that revolutionize industries by enabling data-driven decision making. As a Data Engineer, you will hold a crucial position within our dynamic team, actively contributing to thrilling projects that reshape data analytics for our clients, providing them with a competitive advantage in their respective industries. If you have a passion for data analytics solutions that make a real-world impact, consider this your pass to the captivating world of Data Science and Engineering!

Key Responsibilities
- Understand the business requirements.
- Design, develop, and maintain data pipelines using dbt to transform and load data into a data warehouse (e.g., Snowflake, BigQuery).
- Create and manage data models within dbt to ensure data consistency and accuracy for analysis.
- Build and maintain Looker dashboards and Explores, translating complex data into intuitive visualizations for business users.
- Collaborate with business stakeholders to understand data requirements and translate them into effective data visualizations within Looker.
- Monitor data quality and implement data quality checks to ensure data integrity.
- Optimize data pipelines for performance and scalability.
- Troubleshoot data issues and resolve data quality discrepancies.
- Implement data governance practices to ensure data security and compliance.

Minimum Qualifications:
- Demonstrated expertise with a minimum of 5+ years of experience as a data engineer or in a similar role
- Advanced SQL skills and experience with relational and non-relational databases
- Deep understanding of data warehousing concepts
- Extensive experience with dbt (data build tool)
- Expertise in Looker dashboard development and administration
- Should have worked on Google Cloud services like BigQuery, Dataflow, and Cloud Storage
- Strong Python/PySpark skills with hands-on experience
- Strong exposure to writing ETL
- Proven track record of designing and implementing data pipelines using dbt
- Experience building and maintaining Looker dashboards and Explores
- Familiarity with data governance practices and data security protocols
- Experience in data collection from multiple sources and integration based on business needs

If interested, please share your resume at Kiran.Dhanak@3pillarglobal.com or call 96533-53608.
Regards
Kiran Dhanak
TA Team
3Pillar
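As a hedged illustration of the data quality checks mentioned above (not taken from the posting): a small Python snippet using the google-cloud-bigquery client to flag rows with missing keys. The project ID and the `analytics.fct_orders` table are hypothetical placeholders.

```python
from google.cloud import bigquery

# Assumes application default credentials; project and table names are hypothetical.
client = bigquery.Client(project="my-analytics-project")

query = """
    SELECT order_date, COUNT(*) AS missing_customer_rows
    FROM `my-analytics-project.analytics.fct_orders`
    WHERE customer_id IS NULL
    GROUP BY order_date
    HAVING COUNT(*) > 0
    ORDER BY order_date
"""

# Any rows returned indicate a data quality issue worth surfacing in a dashboard or alert.
for row in client.query(query).result():
    print(f"{row.order_date}: {row.missing_customer_rows} rows missing customer_id")
```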
Posted 2 months ago
3.0 - 8.0 years
5 - 10 Lacs
Chennai
Hybrid
Duration: 8 Months
Work Type: Onsite
Position Description: Looking for qualified Data Scientists who can develop scalable solutions to complex real-world problems using Machine Learning, Big Data, Statistics, and Optimization. Potential candidates should have hands-on experience in applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience in manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms. Candidates should have excellent problem-solving skills with an inquisitive mind to challenge existing practices. Candidates should have exposure to multiple programming languages and analytical tools and be flexible about using the requisite tools/languages for the problem at hand.
Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery
Experience Required: 3+ years of hands-on experience in using machine learning/text mining tools and techniques such as clustering/classification/decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms
Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP) including Vertex AI, BigQuery, DBT, NoSQL databases, and the Hadoop ecosystem
Education Required: Bachelor's Degree
Posted 2 months ago
5.0 - 9.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for a Senior Data Engineer who will design, build, and maintain scalable data pipelines and ingestion frameworks. The ideal candidate must have experience with DBT, orchestration tools like Airflow or Prefect, and cloud platforms such as AWS. Responsibilities include developing ELT pipelines, optimizing queries, implementing CI/CD, and integrating with AWS services. Strong SQL, Python, and data modeling skills are essential. The role also involves working with real-time and batch processing, ensuring high performance and data integrity.
Posted 2 months ago
6.0 - 9.0 years
15 - 30 Lacs
Noida, Pune, Chennai
Work from Office
Job Description
Must-Have Experience: SQL, dbt, Snowflake, Data Quality & Data Modelling, Snowpipe, Fivetran
Candidates can share resumes at deepali.rawat@rsystems.com

Roles & Responsibilities
1. Ensure reliable and scalable data pipelines to support healthcare operations.
2. Maintain data availability with proactive exception handling and recovery mechanisms.
3. Perform data quality checks to ensure accuracy, completeness, and consistency.
4. Detect and handle alerts early to prevent data discrepancies and processing failures.
5. Develop and optimize data models for efficient storage, retrieval, and analytics.
6. Prepare and structure data for reporting, compliance, and decision-making.
7. Work with Snowflake to manage data warehousing and performance tuning.
8. Implement and optimize DBT workflows for data transformation and governance.
9. Leverage SQL and Python for data processing, automation, and validation.
10. Experience with Snowpipe and Fivetran is a plus for automating data ingestion.
Posted 2 months ago
8.0 - 9.0 years
8 - 9 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.

Experience: 8 - 9 years
Salary: Not Disclosed
Location: Gurugram
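As an illustrative sketch of the RBAC/masking responsibilities listed above (not from the posting): applying a column masking policy through the Snowflake Python connector. The account, role, warehouse, schema, table, and policy names are hypothetical, and masking policies require an appropriate Snowflake edition and privileges.

```python
import os
import snowflake.connector

# Credentials come from the environment; role and warehouse names are assumed.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    role="SECURITYADMIN",
    warehouse="ADMIN_WH",
)
cur = conn.cursor()

# Column-level masking: only a privileged role sees the raw value.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS pii.email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('ANALYST_PII') THEN val ELSE '***MASKED***' END
""")

# Attach the policy to a hypothetical customer email column.
cur.execute("""
    ALTER TABLE crm.customers MODIFY COLUMN email
    SET MASKING POLICY pii.email_mask
""")
conn.close()
```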
Posted 2 months ago
11.0 - 12.0 years
11 - 12 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Overview: We are looking for a skilled Snowflake Developer with 8+ years of experience in developing and managing data warehouse solutions using Snowflake. The ideal candidate should have expertise in stored procedures, SQL scripting, and DBT development using models, macros, and jobs. The candidate should also have a strong understanding of DWH concepts, along with experience in developing ETL solutions and implementing CI/CD pipelines using Bitbucket, Jenkins, DBT, and Snowflake. Additionally, the candidate should have experience in collaborating with stakeholders to gather requirements, develop logic, and deploy solutions.

In This Role, You Will:
- Manage and maintain the Snowflake platform, ensuring optimal performance and uptime.
- Design and implement Snowflake architecture, considering best practices for scalability, security, and compliance.
- Conduct performance optimization activities to ensure efficient use of resources and credits.
- Oversee governance and compliance practices, enabling the right audit logs and ensuring data security using RBAC, masking, etc.
- Perform POCs to evaluate new features and functionalities.
- Enable and configure new features on the Snowflake platform.
- Develop and implement integration design strategies using AWS services such as S3, Lambda, SQS, and Kinesis.
- Design and implement API-based integrations to ensure seamless data flow between systems.
- Collaborate with cross-functional teams to ensure the successful implementation of Snowflake projects.
- Utilize programming languages, particularly Python, to develop custom solutions and automation scripts.

Here's What You Need:
- Proven experience working with Snowflake and AWS cloud platforms.
- In-depth knowledge of Snowflake architecture, design, and best practices.
- Strong understanding of compliance and governance practices, with the ability to enable and manage audit logs.
- Expertise in performance optimization and credit usage management on the Snowflake platform.
- Experience with AWS services such as S3, Lambda, SQS, and Kinesis.
- Proficiency in API-based integrations and data integration strategies.
- Strong programming skills, particularly in Python.
- Excellent collaboration and communication skills, with the ability to work effectively with cross-functional teams.
Posted 2 months ago
5 - 10 years
0 - 3 Lacs
Hyderabad
Hybrid
Job Profile
We are seeking a Senior Data Engineer with proven expertise in designing and maintaining scalable, efficient, and reliable data pipelines. The ideal candidate should have strong proficiency in SQL, DBT, BigQuery, Python, and Airflow, along with a solid foundation in data warehousing principles. In this role, you will be instrumental in managing and optimizing data workflows, ensuring high data quality, and supporting data-driven decision-making across the organization. Experience with Oracle ERP systems and knowledge of data migration to a data warehouse environment will be considered a valuable advantage.
Years of Experience: 5 to 10 Years.
Shift Timings: 1 PM to 10 PM IST.

Skill Set
- SQL: Advanced proficiency in writing optimized queries, working with complex joins, CTEs, window functions, etc.
- DBT (Data Build Tool): Experience in modelling data with dbt, managing data transformations, and maintaining project structure.
- Python: Proficient in writing data processing scripts and building Airflow DAGs using Python.
- BigQuery: Strong experience with GCP's BigQuery, including dataset optimization, partitioning, and query cost management.
- Apache Airflow: Experience building and managing DAGs, handling dependencies, scheduling jobs, and error handling.
- Data Warehousing Concepts: Strong grasp of ETL/ELT, dimensional modelling (star/snowflake), fact/dimension tables, slowly changing dimensions, etc.
- Version Control: Familiarity with Git/GitHub for code collaboration and deployment.
- Cloud Platforms: Working knowledge of Google Cloud Platform (GCP).

Job Description
Roles & Responsibilities:
- Design, build, and maintain robust ETL/ELT data pipelines using Python, Airflow, and DBT.
- Develop and manage dbt models to enable efficient, reusable, and well-documented data transformations.
- Collaborate with stakeholders to gather data requirements and design data marts comprising fact and dimension tables in a well-structured star schema.
- Manage and optimize data models and transformation logic in BigQuery, ensuring high performance and cost-efficiency.
- Implement and uphold robust data quality checks, logging, and alerting mechanisms to ensure reliable data delivery.
- Maintain the BigQuery data warehouse, including routine optimizations and updates.
- Enhance and support the data warehouse architecture, including the use of star/snowflake schemas, partitioning strategies, and data mart structures.
- Proactively monitor and troubleshoot production pipelines to minimize downtime and ensure data accuracy.
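To make the BigQuery partitioning and cost-management point above concrete (a sketch under assumed names, not part of the posting): creating a date-partitioned, clustered fact table with the google-cloud-bigquery client, so that queries filtered on the partition column scan, and bill for, fewer bytes.

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes application default credentials

# Hypothetical fact table for a star-schema data mart.
table = bigquery.Table(
    "my-project.sales_mart.fct_orders",
    schema=[
        bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("order_date", "DATE", mode="REQUIRED"),
        bigquery.SchemaField("customer_key", "INT64"),
        bigquery.SchemaField("amount", "NUMERIC"),
    ],
)

# Partition by day on order_date and cluster by customer_key:
# queries that filter on these columns prune partitions and reduce cost.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY, field="order_date"
)
table.clustering_fields = ["customer_key"]

client.create_table(table, exists_ok=True)
print("Created (or found) partitioned table fct_orders")
```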
Posted 2 months ago
15 - 19 years
15 - 30 Lacs
Noida, Chennai, Bengaluru
Hybrid
Job Description
Experience: 12 to 16 Years
Primary Skill: Delivery management with a Data Warehouse background
Notice Period: Immediate to 30 Days
Work Location: Chennai, Noida & Bangalore

Required Skills
- 12+ years of experience in managing delivery of Data Warehouse projects (development & modernization/migration).
- Strong delivery background with experience in managing large, complex Data Warehouse engagements.
- Good to have experience with Snowflake, Matillion, DBT, Netezza/DataStage, and Oracle.
- Healthcare Payer industry experience.
- Extensive experience in Program/Project Management and Iterative, Waterfall, and Agile methodologies.
- Ability to track and manage complex program budgets.
- Experience in managing the delivery of complex programs to meet the needs and the required timelines set for the defined programs.
- Communicate program review results to various stakeholders.
- Experience in building the team, providing guidance, and education as needed to ensure the success of priority programs and promote cross-training within the department.
- Experience in developing and managing integrated program plans that incorporate both technical and business deliverables.
- Verify that critical decision gates are well defined, communicated, and monitored for executive approval throughout the program.
- Verify that work supports the corporate strategic direction.
- Review resulting vendor proposals and estimates to ensure they satisfy both our functional requirements and technology strategies.
- Project management methodologies, processes, and tools; knowledge of the Project Development Life Cycle.
- Establish and maintain strong working relationships with various stakeholders including team members, IT resources, resources in other areas of the business, and upper management.
- Strong business acumen and political savvy.
- Ability to collaborate while dealing with complex situations.
- Ability to think creatively and to drive innovation.
- Ability to motivate, lead, and inspire a diverse group to a common goal/solution with multiple stakeholders.
- Ability to convert business strategy into action-oriented objectives and measurable results.
- Strong negotiating, influencing, and consensus-building skills.
- Ability to mentor, coach, and provide guidance to others.

Responsibilities:
- Responsible for the end-to-end delivery of Application Development and Support services for the client.
- Coordinate with the Enterprise Program Management Office to execute programs following defined standards and governance structures to ensure alignment with the approved project development life cycle (PDLC).
- Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution.
- Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and/or to resolve issues.
- Proactively engage other members of the organization with specific subject knowledge to resolve issues or provide assistance.
- Lead post-implementation reviews of major initiatives to provide lessons learned and continuous improvement.
- Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic initiatives' implementation status.
- Collaborate with business owners to develop divisional business plans that support the overall strategic direction.
- Support the budget allocation process through ongoing financial tracking reports.
- Develop and maintain service plans considering the customer requirements.
- Track and monitor to ensure adherence to SLAs/KPIs.
- Identify opportunities for improvement to the service delivery process.
- Address service delivery issues/escalations/complaints; act as the first point of escalation for customer escalations.
- Oversee shift management for various tracks.
- Responsible for publishing production support reports & metrics.
Posted 2 months ago
6 - 10 years
10 - 20 Lacs
Hyderabad
Work from Office
We're looking for a Data Engineer to join our team. We need someone who's great at building data pipelines and understands how data works. You'll be using tools like DBT and Snowflake a lot. The most important thing for us is that you've worked with all sorts of data sources, not just files. Think different cloud systems, other company databases, and various online tools.

What you'll do:
- Build and manage how data flows into our system using DBT and store it in Snowflake.
- Design how our data is organized so it's easy to use for reports and analysis.
- Fix any data problems that come up.
- Connect to and get data from many different places, like: cloud apps (e.g., Salesforce, marketing tools), various databases (SQL Server, Oracle, etc.), streaming data, different file types (CSV, JSON, etc.), and other business systems.
- Help us improve our data setup.

What you need:
- Experience as a Data Engineer.
- Strong skills with DBT (Data Build Tool).
- Solid experience with Snowflake.
- Must have experience working with many different types of data sources, especially cloud systems and other company databases, not just files.
- Good at data modeling (organizing data).
- Comfortable with SQL.
- Good at solving problems.
Posted 2 months ago
6 - 11 years
12 - 22 Lacs
Gurugram, Bengaluru
Work from Office
Data Engineer
As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions/reports/dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt, and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.

As a Data Engineer you'll be:
- Developing end-to-end ETL/ELT pipelines, working with Data Analysts of the business function.
- Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture.
- Mentoring other junior engineers in the team.
- Being a go-to expert for data technologies and solutions.
- Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges.
- Troubleshooting and resolving technical issues as they arise.
- Looking for ways of improving both what and how data pipelines are delivered by the department.
- Translating business requirements into technical requirements, such as entities that need to be modelled, DBT models that need to be built, timings, tests, and reports.
- Owning the delivery of data models and reports end to end.
- Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future.
- Working with Data Analysts to ensure that all data feeds are optimised and available at the required times. This can include Change Capture, Change Data Control, and other “delta loading” approaches.
- Discovering, transforming, testing, deploying, and documenting data sources.
- Applying, helping to define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review.
- Building Looker dashboards for use cases if required.

What makes you a great fit:
- 3+ years of extensive development experience using Snowflake or similar data warehouse technology.
- Working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker.
- Experience in agile processes, such as Scrum.
- Extensive experience in writing advanced SQL statements and performance tuning them.
- Experience in data ingestion techniques using custom or SaaS tools like Fivetran.
- Experience in data modelling and the ability to optimise existing/new data models.
- Experience in data mining, data warehouse solutions, and ETL, and using databases in a business environment with large-scale, complex datasets.
Posted 2 months ago
8 - 11 years
14 - 19 Lacs
Thiruvananthapuram
Work from Office
We are looking for a skilled professional with 8 to 11 years of industry experience to lead our migration of the data analytics environment from Teradata to Snowflake, focusing on performance and reliability. The ideal candidate will have strong technical expertise in big data engineering and hands-on experience with Snowflake.

Roles and Responsibilities
- Lead the migration of data analytics environments from Teradata to Snowflake, emphasizing performance and reliability.
- Design and deploy big data pipelines in a cloud environment using Snowflake Cloud DW.
- Develop and migrate existing on-prem ETL routines to cloud services.
- Collaborate with senior leaders to understand business goals and contribute to workstream delivery.
- Design and optimize model code for faster execution.
- Work with cross-functional teams to ensure seamless integration of data analytics solutions.

Job Requirements
- Minimum 8 years of experience as an Architect on analytics solutions.
- Strong technical experience with Snowflake, including modeling, schema, and database design.
- Experience integrating with third-party tools, ETL, and DBT tools.
- Proficiency in programming languages such as Java, Scala, or Python.
- Excellent communication skills, both written and verbal, with the ability to communicate complex technical concepts effectively.
- Flexible and proactive working style with strong personal ownership of problem resolution.
- A computer science graduate or equivalent is required.
Posted 2 months ago
5 - 10 years
12 - 16 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities
1. Mastery of SQL, especially within cloud-based data warehouses like Snowflake. Experience on Snowflake with data architecture, design, analytics, and development.
2. Detailed knowledge and hands-on working experience in Snowpipe/SnowProc/SnowSQL.
3. Technical lead with a strong development background, having 2-3 years of rich hands-on development experience in Snowflake.
4. Experience designing highly scalable ETL/ELT processes with complex data transformations and data formats, including error handling and monitoring. Good working knowledge of ETL/ELT tools.
5. Analysis, design, and development of traditional data warehouse and business intelligence solutions. Work with customers to understand and execute their requirements.
6. Working knowledge of software engineering best practices. Should be willing to work in implementation & support projects. Flexible for onsite & offshore traveling.
7. Collaborate with other team members to ensure proper delivery of the requirement. Ability to think strategically about the broader market and influence company direction.
8. Should have good communication skills, be a team player, and have good analytical skills. Snowflake certification is preferable.

Contact: Soniya, soniya05.mississippiconsultants@gmail.com
We are a recruitment firm based in Pune, with various clients globally.

Preferred candidate profile
Posted 2 months ago
6 - 10 years
0 - 2 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role & responsibilities: Senior Data Engineer
Business domain knowledge: SaaS, SFDC, NetSuite
Areas we support: Product, Finance (ARR reporting), GTM, Marketing, Sales
Tech Stack: Fivetran, Snowflake, dbt, Tableau, GitHub

What we are looking for:
1. SQL and data modeling at an intermediate level - write complex SQL queries, build data models, and experience with data transformation.
2. Problem solver: someone who can weed through the ambiguity of the ask.
3. Bias for action: asks questions, reaches out to stakeholders, comes up with solutions.
4. Communication: effectively communicates with stakeholders and team members.
5. Documentation: can create a BRD.
6. Someone well versed in Finance (ARR reporting) and/or GTM (sales and marketing) would be an added advantage.
7. Experience in SaaS, NetSuite, and Salesforce will be a plus.
8. Independent, self-starter, motivated, and experienced with working in an onsite/offshore environment.

Key is excellent communication, ownership, and working with stakeholders in driving requirements.
Posted 2 months ago
6 - 11 years
22 - 35 Lacs
Kolkata, Hyderabad, Bengaluru
Hybrid
Skill Combination: Snowflake + (Python or DBT) + (AWS or Azure) + SQL + Data warehousing
Location: Kolkata
Experience & CTC (fixed):
- Band 4B: 4 to 7 years - up to 21 LPA
- Band 4C: 7 to 11 years - up to 28 LPA
- Band 4D: 10 to 16 years - up to 35 LPA

Inviting applications for the role of Lead Consultant - Snowflake Data Engineer (Snowflake + Python/DBT + Cloud)! In this role, the Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry.
- Working experience with building productionized data ingestion and processing data pipelines in Snowflake.
- Strong understanding of Snowflake architecture.
- Fully well-versed in data warehousing concepts.
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools.
- Able to create data pipelines for ETL/ELT.
- Excellent presentation and communication skills, both written and verbal.
- Ability to problem solve and architect in an environment with unclear requirements.
- Able to create high-level and low-level design documents based on requirements.
- Hands-on experience in configuring, troubleshooting, testing, and managing data platforms, on premises or in the cloud.
- Awareness of data visualisation tools and methodologies.
- Works independently on business problems and generates meaningful insights.
- Good to have some experience/knowledge of Snowpark, Streamlit, or GenAI, but this is not mandatory.
- Should have experience implementing Snowflake best practices.
- Snowflake SnowPro Core Certification will be an added advantage.

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to the customer, working with the offshore team, etc.
- Writing SQL queries against Snowflake and developing scripts to Extract, Load, and Transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures and UDFs, Snowsight, and Streamlit.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Should have some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Should have good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and the ability to design and develop efficient ETL jobs using Python or PySpark.
- Should have some experience with Snowflake RBAC and data security.
- Should have good experience implementing CDC or SCD Type 2.
- Should have good experience implementing Snowflake best practices.
- In-depth understanding of data warehouse and ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Should have experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency in working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have experience in deployment using CI/CD tools and experience with repositories like Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree, with good IT experience and relevant experience as a Snowflake Data Engineer.
Skill Matrix: Snowflake, Python/PySpark, DBT, AWS/Azure, ETL concepts, & Data Warehousing concepts
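To make the Streams/Tasks and CDC items above concrete (an illustrative sketch only, with hypothetical database, table, stream, task, and warehouse names): a Snowflake stream captures changes on a raw table and a scheduled task merges them into a curated table, run here through the Snowflake Python connector.

```python
import os
import snowflake.connector

# Credentials from the environment; warehouse and database names are assumed.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="ANALYTICS",
)
cur = conn.cursor()

# 1. A stream captures inserts/updates on the raw table (change data capture).
cur.execute("CREATE STREAM IF NOT EXISTS raw.customers_stream ON TABLE raw.customers")

# 2. A task periodically merges captured changes into the curated table.
cur.execute("""
    CREATE TASK IF NOT EXISTS raw.apply_customer_changes
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('raw.customers_stream')
    AS
      MERGE INTO curated.customers t
      USING raw.customers_stream s
        ON t.customer_id = s.customer_id
      WHEN MATCHED THEN UPDATE SET t.email = s.email, t.updated_at = CURRENT_TIMESTAMP()
      WHEN NOT MATCHED THEN INSERT (customer_id, email, updated_at)
        VALUES (s.customer_id, s.email, CURRENT_TIMESTAMP())
""")

# Tasks are created suspended; resume to start the schedule (requires privileges).
cur.execute("ALTER TASK raw.apply_customer_changes RESUME")
conn.close()
```

A full SCD Type 2 implementation would additionally close out the previous record version (effective-to timestamps) rather than updating in place; the MERGE here only illustrates the stream-driven apply step.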
Posted 2 months ago
8 - 18 years
10 - 40 Lacs
Hyderabad, Pune, Delhi / NCR
Work from Office
Roles and Responsibilities:
- Lead the development of data warehousing solutions using Snowflake on the Microsoft Azure platform.
- Collaborate with cross-functional teams to design, develop, test, and deploy large-scale data pipelines.
- Ensure high-quality delivery of projects by providing technical guidance and mentorship to junior team members.
- Participate in code reviews and ensure adherence to coding standards.

Job Requirements:
- 8-18 years of experience in building data warehouses using Snowflake on the Microsoft Azure platform.
- Strong expertise in developing complex SQL queries for query performance tuning.
- Proficiency in building efficient ETL processes using various tools such as Data Build Tool (DBT).
- Experience working with big-data technologies like Hadoop, Spark, and Kafka.
Posted 2 months ago
5 - 9 years
20 - 25 Lacs
Pune, Gurugram, Bengaluru
Hybrid
We're looking for a motivated and detail-oriented Senior Snowflake Developer with strong SQL querying skills and a willingness to learn and grow with our team. As a Senior Snowflake Developer, you will play a key role in developing and maintaining our Snowflake data platform, working closely with our data engineering and analytics teams.

Responsibilities:
- Write ingestion pipelines that are optimized and performant
- Manage a team of Junior Software Developers
- Write efficient and scalable SQL queries to support data analytics and reporting
- Collaborate with data engineers, architects, and analysts to design and implement data pipelines and workflows
- Troubleshoot and resolve data-related issues and errors
- Conduct code reviews and contribute to the improvement of our Snowflake development standards
- Stay up-to-date with the latest Snowflake features and best practices

Requirements:
- 5+ years of experience with Snowflake
- Strong SQL querying skills, including data modeling, data warehousing, and ETL/ELT design
- Advanced understanding of data engineering principles and practices
- Familiarity with Informatica Intelligent Cloud Services (IICS) or similar data integration tools is a plus
- Excellent problem-solving skills, attention to detail, and an analytical mindset
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams

Nice to Have:
- Experience using Snowflake Streamlit and Cortex
- Knowledge of data governance, data quality, and data security best practices
- Familiarity with Agile development methodologies and version control systems like Git
- Certification in Snowflake or a related data platform is a plus
Posted 2 months ago
5 - 7 years
15 - 25 Lacs
Pune, Mumbai (All Areas)
Hybrid
DUTIES AND RESPONSIBILITIES:
- Build ETL (extract, transform, and load) jobs using Fivetran and dbt for our internal projects and for customers that use various platforms like Azure, Salesforce, and AWS technologies.
- Monitor active ETL jobs in production.
- Build out data lineage artifacts to ensure all current and future systems are properly documented.
- Assist with the build-out of design/mapping documentation to ensure development is clear and testable for QA and UAT purposes.
- Assess current and future data transformation needs to recommend, develop, and train on new data integration tool technologies.
- Discover efficiencies with shared data processes and batch schedules to help ensure no redundancy and smooth operations.
- Assist the Data Quality Analyst to implement checks and balances across all jobs to ensure data quality throughout the entire environment for current and future batch jobs.
- Hands-on experience in developing and implementing large-scale data warehouses, Business Intelligence, and MDM solutions, including Data Lakes/Data Vaults.

SUPERVISORY RESPONSIBILITIES: This job has no supervisory responsibilities.

QUALIFICATIONS:
- Bachelor's Degree in Computer Science, Math, Software Engineering, Computer Engineering, or a related field AND 6+ years of experience in business analytics, data science, software development, data modeling, or data engineering work
- 3-5 years of experience with strong proficiency in SQL query/development skills
- Develop ETL routines that manipulate and transfer large volumes of data and perform quality checks
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, Azure Data Factory)
- Experience working in the healthcare industry with PHI/PII
- Creative, lateral, and critical thinker
- Excellent communicator
- Well-developed interpersonal skills
- Good at prioritizing tasks and time management
- Ability to describe, create, and implement new solutions
- Experience with related or complementary open source software platforms and languages (e.g., Java, Linux, Apache, Perl/Python/PHP, Chef)
- Knowledge of / hands-on experience with BI tools and reporting software (e.g., Cognos, Power BI, Tableau)
- Big Data stack (e.g., Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase, Hive, Flume)
Posted 2 months ago
3 - 8 years
6 - 15 Lacs
Hyderabad
Work from Office
Novastrid is hiring an experienced Data Engineer for a leading Tier-1 company in Hyderabad. If you're passionate about building robust, scalable data systems and working with cutting-edge big data technologies, this is your opportunity to work with one of the best in the industry.

Role & responsibilities
- Design and implement scalable, high-performance batch and real-time data pipelines using Apache Spark, Kafka, Java, and SQL.
- Build and maintain ETL/ELT frameworks handling structured, semi-structured, and unstructured data.
- Work on streaming data solutions using Spark Structured Streaming and Kafka.
- Develop and optimize data models; implement data warehousing solutions on AWS/Azure/GCP.
- Automate and orchestrate workflows using Apache Airflow, DBT, or equivalent tools.
- Collaborate with cross-functional teams (Data Science, Product, Engineering).
- Monitor, troubleshoot, and ensure the reliability of data systems.
- Follow best practices in data governance, security, and cloud cost optimization.

Preferred candidate profile
- 3 to 8 years of hands-on experience in Data Engineering / Big Data development.
- Strong expertise in: Apache Spark, Kafka, Java (production-grade experience), advanced SQL, and Python/Scala (optional but a plus).
- Experience with cloud platforms (AWS/Azure/GCP).
- Familiarity with Git, CI/CD pipelines, and modern DataOps practices.

Good to Have
- Experience with NoSQL (MongoDB, Cassandra, DynamoDB).
- Exposure to Docker and Kubernetes.
- Domain experience in Banking / FinTech / Financial Services.

Educational Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
Posted 2 months ago
4 - 6 years
12 - 15 Lacs
Hyderabad
Remote
Job Summary
We are looking for a Data Modeler to design and optimize data models supporting automotive industry analytics and reporting. The ideal candidate will work with SAP ECC as a primary data source, leveraging Databricks and Azure Cloud to design scalable and efficient data architectures. This role involves developing logical and physical data models, ensuring data consistency, and collaborating with data engineers, business analysts, and domain experts to enable high-quality analytics solutions.

Key Responsibilities:
1. Data Modeling & Architecture: Design and maintain conceptual, logical, and physical data models for structured and unstructured data.
2. SAP ECC Data Integration: Define data structures for extracting, transforming, and integrating SAP ECC data into Azure Databricks.
3. Automotive Domain Modeling: Develop and optimize industry-specific data models covering customer, vehicle, material, and location data.
4. Databricks & Delta Lake Optimization: Design efficient data models for Delta Lake storage and Databricks processing.
5. Performance Tuning: Optimize data structures, indexing, and partitioning strategies for performance and scalability.
6. Metadata & Data Governance: Implement data standards, data lineage tracking, and governance frameworks to maintain data integrity and compliance.
7. Collaboration: Work closely with business stakeholders, data engineers, and data analysts to align models with business needs.
8. Documentation: Create and maintain data dictionaries, entity-relationship diagrams (ERDs), and transformation logic documentation.

Skills & Qualifications
- Data Modeling Expertise: Strong experience in dimensional modeling, 3NF, and hybrid modeling approaches.
- Automotive Industry Knowledge: Understanding of customer, vehicle, material, and dealership data models.
- SAP ECC Data Structures: Hands-on experience with SAP ECC tables, business objects, and extraction processes.
- Azure & Databricks Proficiency: Experience working with Azure Data Lake, Databricks, and Delta Lake for large-scale data processing.
- SQL & Database Management: Strong skills in SQL, T-SQL, or PL/SQL, with a focus on query optimization and indexing.
- ETL & Data Integration: Experience collaborating with data engineering teams on data transformation and ingestion processes.
- Data Governance & Quality: Understanding of data governance principles, lineage tracking, and master data management (MDM).
- Strong Documentation Skills: Ability to create ER diagrams, data dictionaries, and transformation rules.

Preferred Qualifications
- Experience with data modeling tools such as Erwin, Lucidchart, or DBT.
- Knowledge of Databricks Unity Catalog and Azure Synapse Analytics.
- Familiarity with Kafka/Event Hub for real-time data streaming.
- Exposure to Power BI/Tableau for data visualization and reporting.
Posted 2 months ago