
2818 Snowflake Jobs - Page 5

JobPe aggregates listings so they are easy to find, but you apply directly on the original job portal.

6.0 - 10.0 years

7 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Skills: Data Engineering, Airflow, Fivetran, CI/CD using GitHub

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences that drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
• Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
• Take the lead in analyzing, designing, and implementing data solutions, including constructing data models and ETL processes.
• Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
• Lead and mentor engineering discussions, advocating for best practices.
• Actively participate in design and code reviews.
• Access and explore third-party data APIs to determine the data required to meet business needs.
• Ensure data quality and integrity across different sources and systems.
• Manage data pipelines for both analytics and operational purposes.
• Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
• Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
• Have over 5 years of experience in Data Engineering, focusing on building and maintaining data environments.
• Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions in an SLA-driven environment.
• Have a strong background in developing data products and APIs and maintaining testing, monitoring, isolation, and SLA processes.
• Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
• Are proficient in Python or other scripting languages.
• Are familiar with columnar OLAP databases and data modeling.
• Have experience building ELT/ETL processes using tools like dbt, Airflow, and Fivetran, with CI/CD using GitHub and reporting in Tableau (a minimal orchestration sketch follows this posting).
• Have excellent communication and interpersonal skills to collaborate with business stakeholders and translate requirements.

Added bonus if you also have:
• A good understanding of Salesforce and NetSuite systems
• Experience in SaaS environments
• Designed and deployed ML models
• Experience with events and streaming data

Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
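For context on the stack this posting names, here is a minimal Airflow 2.x DAG that orchestrates a dbt run followed by dbt tests; the DAG id, schedule, and project path are illustrative assumptions rather than details from the listing, and a Fivetran sync would typically sit upstream as its own task or sensor.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Airflow 2.4+ style scheduling; ids and paths are hypothetical.
with DAG(
    dag_id="elt_dbt_daily",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00, after raw data has landed
    catchup=False,
) as dag:
    # Build the warehouse models once ingestion (e.g. a Fivetran sync) is done.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    # Validate the freshly built models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test
```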

Posted 3 days ago

Apply

2.0 - 12.0 years

12 - 16 Lacs

Bengaluru

Work from Office

About Us: Narwal, with its Global Delivery Model, strategically expands its reach across North America, the United Kingdom, and an offshore development centre in India. Delivering cutting-edge AI, Data, and Quality Engineering solutions and consistently surpassing expectations, Narwal has achieved remarkable triple-digit growth rates year after year, earning accolades such as Inc. 5000, Best IT Services Company, Best Data Technology Company, and Partner of the Year with Tricentis.

Our Vision: To be an expert in AI, Data, Cloud, and Quality Engineering transformations, bold in our thinking and authentic in our relationships.

Key Responsibilities:
• Hands-on development utilizing AWS tools, Snowflake, dbt, and Astronomer
• Lead a small team of data engineers to deliver data programs and provide guidance in technical design, architecture, troubleshooting, and deployment
• Effectively engage with senior management and business partners to own and deliver solutions
• Proficiency in Python scripting and stored procedures
• Manage and develop processes for batch and real time using Fivetran and Kafka (see the consumer sketch after this posting)
• Experience with orchestration and monitoring tools like dbt and Astronomer, and GitHub version control
• Experience leveraging GitHub Copilot for efficient code generation
• Proficiency in leveraging compute and storage resources in a cloud data platform to deliver cost-effective data solutions
• Ability to deliver solutions with a focus on data quality, performance, and master data management
• Partner with business teams/stakeholders to understand their strategic goals and help them drive value through data/analytics capability and technology
• Take accountability and ownership for all solutions delivered, with a sense of urgency
• Ability to incorporate industry-standard technology solutions into day-to-day delivery

What you bring:
• 10+ years of related experience with a bachelor's degree, or 5 years and a master's degree
• 5+ years of experience in ETL, data warehousing concepts (on-prem and cloud), and database programming
• 2+ years of experience in Snowflake/Python scripting
• Experience with scripting languages, preferably JavaScript and shell scripting, and AWS services such as S3, EC2, Lambda, SQS, and SNS
• Working knowledge of reporting tools like Tableau, Athena, WebFOCUS, and Power BI
• Excellent communication and presentation skills, able to tailor results into language appropriate for the target audience
• Experience working in all stages of a mature Agile software development life cycle
• Experience handling very large datasets and diverse data formats like Parquet, Iceberg, and JSON
• Extensive experience capturing and translating complex business requirements into practical solutions

Why Narwal:
• Opportunity to shape the future of a rapidly growing company
• Competitive salary and benefits package
• A supportive and inclusive company culture
• Collaborative and innovative work environment
• Access to professional development and growth opportunities
• Certified as a Great Place to Work

Narwal is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. For more information, please visit: https://narwal.ai/
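To illustrate the real-time side of the batch-and-streaming responsibilities above, here is a minimal Kafka consumer sketch using the kafka-python package; the topic, broker address, and group id are placeholders, not details from the posting.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; a real deployment would use its own names.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="data-eng-demo",
)

for message in consumer:
    event = message.value
    # A real pipeline would validate this record and stage it for the
    # warehouse (e.g. land it in S3 for Snowflake to COPY); we just print.
    print(event)
```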

Posted 3 days ago

Apply

1.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

At Warner Music Group, we're a global collective of music makers and music lovers, tech innovators and inspired entrepreneurs, game-changing creatives and passionate team members. Here, we turn dreams into stardom and audiences into fans. We are guided by three core values that underpin everything we do across all our diverse businesses:

Curiosity: We do our best work when we're immersing ourselves in culture and breaking through barriers. Curiosity is the driving force behind creativity and ingenuity. It fuels innovation, and innovation is the key to our future.

Collaboration: Making music and bringing it to the world is all about the power of originality amplified by teamwork. A great idea, like a great song, travels globally. We ignite passions and build connections across our diverse community of artists, songwriters, partners, and fans.

Commitment: We pursue excellence for our team and our talent. Everything in music starts with a leap into the unknown, and we're committed to keeping the faith, acting with integrity, and delivering on our promises.

Technology is one of the most important parts of our business. Whether it's signing up new artists; ensuring we provide the right data to Spotify, YouTube, and other digital service providers; or helping artists use the latest AI tools and make thoughtful decisions with data-driven insights, technology plays an invaluable role in our success. The engineering team at Warner Music Group makes all of it a reality.

WMG is home to a wide range of artists, musicians, and songwriters that fuel our success. That is why we are committed to creating a work environment that actively values, appreciates, and respects everyone. We encourage applications from people with a wide variety of backgrounds and experiences. Consider a career at WMG and get the best of both worlds: an innovative global music company that retains the creative spirit of a nimble independent.

Your Role: We are building the next-generation Data Platform that sets the standard for freshness, accuracy, comprehensiveness, and ease of use to power Warner Music Group's business. This is a unique opportunity to be part of a brand-new, high-performing engineering center of excellence and drive significant impact for WMG's technology initiatives. This role will collaborate closely with our North American engineering teams to synchronize strategies, processes, and objectives. This is a hybrid position that requires you to work onsite at our Bangalore office a few days per week.

We are reimagining this platform from the ground up, including:
• How we store, represent, ingest, and serve factual data about artists, songwriters, and their works
• How we ingest, process, ETL, and serve data about music consumption across all digital platforms we partner with
• How we represent the relationships between artists, songwriters, and their works to power next-generation search and discovery for our analysts and business partners

Our Data Platform is the foundation of all we do at Warner Music Group, feeding core business processes like marketing optimization, performance tracking, and rights acquisition and distribution; next-generation capabilities like artist-fan connection, trend analysis, and other machine-learning-powered applications; and even advanced capabilities like generative AI applications in music. We need strong engineers who love music, data, and building world-class systems that scale to solve data problems.

Responsibilities:
• Reimagine and implement the future of tech for the music industry, building an all-new codebase
• Work as part of a dynamic and highly effective team
• Own the creation and delivery of highly innovative products
• Learn and grow as a professional through close collaboration with your team members and engineering leaders, and by being part of a culture of continuous improvement and learning

About You:
• You have an undergraduate or graduate degree in Computer Science, Computer Engineering, or another related field
• You have at least 6 years of experience in backend development or data engineering
• You have built or developed large-scale data processing pipelines and/or large, high-availability dimensional datastores; experience with Snowflake or Databricks is a plus
• You are passionate about music and have a deep desire to provide the data that will help bring more great music into the world
• You have a high sense of ownership and a drive to deliver impact in a fast-paced, evolving, ambiguous environment
• You have a drive to grow, learn, and master the craft of software development

As the home to 10K Projects, Asylum, Atlantic Music Group, East West, FFRR, Fueled by Ramen, Nonesuch, Parlophone, Rhino, Roadrunner, Sire, Warner Records, Warner Classics, and several other of the world's premier recording labels, Warner Music Group champions emerging artists and global superstars alike. Our renowned publishing company, Warner Chappell Music, represents genre-spanning songwriters and producers through a catalog of more than one million copyrights worldwide. Warner Music Group is also home to ADA, which supports the independent community, as well as artist services division WMX. In addition, WMG counts film and television storytelling powerhouse Warner Music Entertainment among its many brands. Together, we are Warner Music Group: Independent Minds. Major Sound.

Love this job and want to apply? Click the "Apply" link at the top of the page, or apply directly with your LinkedIn profile. Applying with LinkedIn will import all of the information you put in your profile, but will still allow you to upload a resume and cover letter. Don't be discouraged if you don't hear from us right away. We're taking our time to review all resumes and to find the best people for WMG. Thanks for your interest in working for WMG. We love it here, and think you will, too.

WMG is committed to inclusion and diversity in all aspects of our business. We are proud to be an equal opportunity workplace and will evaluate qualified applicants without regard to race, religious creed, color, age, sex, sexual orientation, gender, gender identity, gender expression, national origin, ancestry, marital status, medical condition as defined by state law (genetic characteristics or cancer), physical or mental disability, military service or veteran status, pregnancy, childbirth and related medical conditions, genetic information, or any other characteristic protected by applicable federal, state, or local law.

Copyright 2025 Warner Music Inc.

Posted 3 days ago

Apply

7.0 - 12.0 years

14 - 24 Lacs

Pune

Work from Office

vConstruct, a Pune-based construction technology company, is seeking a Senior Data Engineer for its Data Science and Analytics team, a close-knit group of analysts and engineers supporting all data aspects of the business. You will be responsible for designing, developing, and maintaining our data infrastructure, ensuring data integrity, and supporting various data-driven projects. You will work closely with cross-functional teams to integrate, process, and manage data from various sources, enabling business insights and enhancing operational efficiency.

Responsibilities:
• Lead the end-to-end design and development of scalable, high-performance data pipelines and ETL/ELT frameworks aligned with modern data engineering best practices.
• Architect complex data integration workflows that bring together structured, semi-structured, and unstructured data from both cloud and on-premise sources.
• Build robust real-time, batch, and on-demand pipelines with built-in observability: monitoring, alerting, and automated error handling (a retry-and-alert pattern is sketched after this posting).
• Partner with analysts, data scientists, and business leaders to define and deliver reliable data models, quality frameworks, and SLAs that power key business insights.
• Ensure optimal pipeline performance and throughput, with clearly defined SLAs and proactive alerting for data delivery or quality issues.
• Collaborate with platform, DevOps, and architecture teams to build secure, reusable, and CI/CD-enabled data workflows that align with enterprise architecture standards.
• Establish and enforce best practices in source control, code reviews, testing automation, and continuous delivery for all data engineering components.
• Lead root cause analysis (RCA) and preventive maintenance for critical data failures, ensuring minimal business impact and continuous service improvement.
• Guide the team in establishing standards for data modeling, transformation logic, and governance, ensuring long-term maintainability and scalability.
• Design and execute comprehensive testing strategies (unit, integration, and system testing), ensuring high data reliability and pipeline resilience.
• Monitor and fine-tune data pipeline and query performance, optimizing for reliability, scalability, and cost-efficiency.
• Create and maintain detailed technical documentation, including data architecture diagrams, process flows, and integration specifications for internal and external stakeholders.
• Facilitate and lead discussions with business and operational teams to understand data requirements, prioritize initiatives, and drive data strategy forward.

Qualifications:
• 7 to 10 years of hands-on experience in data engineering roles with a proven record of building scalable and secure data platforms.
• Over 5 years of experience in scripting languages such as Python for data processing, automation, and ETL development.
• 4+ years of experience with Snowflake, including performance tuning, security model design, and advanced SQL development.
• 5+ years of experience with data integration tools such as Azure Data Factory, Fivetran, or Matillion.
• 5+ years of experience writing complex, highly optimized SQL queries on large datasets.
• Proven experience integrating and managing APIs, JSON, XML, and webhooks for data acquisition.
• Hands-on experience with cloud platforms (Azure/AWS) and orchestration tools like Apache Airflow or equivalent.
• Experience with CI/CD pipelines, automated testing, and code versioning tools (e.g., Git).
• Familiarity with dbt or similar transformation tools and best practices for modular transformation development.
• Exposure to data visualization tools like Power BI for supporting downstream analytics is a plus.
• Strong interpersonal and communication skills with the ability to lead discussions with technical and business stakeholders.

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Equivalent academic and work experience can be considered.

About vConstruct: vConstruct specializes in providing high-quality Building Information Modeling and construction technology services geared towards construction projects. vConstruct is a wholly owned subsidiary of DPR Construction. For more information, please visit www.vconstruct.com

About DPR Construction: DPR Construction is a national commercial general contractor and construction manager specializing in technically challenging and sustainable projects for the advanced technology, biopharmaceutical, corporate office, higher education, and healthcare markets. With the purpose of building great things (great teams, great buildings, great relationships), DPR is a truly great company. For more information, please visit www.dpr.com
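As a rough illustration of the "built-in observability" the responsibilities call for, here is a minimal retry-and-alert pattern in plain Python; the step, attempt count, and backoff are placeholders, and a production version would page on-call rather than log.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, max_attempts=3, backoff_seconds=30):
    """Run one pipeline step, retrying on failure and alerting when exhausted."""
    for attempt in range(1, max_attempts + 1):
        try:
            step()
            log.info("step %s succeeded on attempt %d", step.__name__, attempt)
            return
        except Exception:
            log.exception("step %s failed (attempt %d/%d)",
                          step.__name__, attempt, max_attempts)
            if attempt == max_attempts:
                # Stand-in for a real alert (PagerDuty, Slack, email, ...).
                log.critical("ALERT: step %s exhausted retries", step.__name__)
                raise
            time.sleep(backoff_seconds)

def load_orders():
    pass  # placeholder for a real extract/load step

run_with_retries(load_orders, max_attempts=3, backoff_seconds=1)
```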

Posted 3 days ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Lexington Partners is one of the world's largest and most trusted managers of secondary private equity and co-investment funds. Since our founding in 1994, we have been at the forefront of innovation in private equity investing, managing over $70 billion in committed capital and partnering with a global network of institutional investors, private equity firms, and portfolio companies.

What are the ongoing responsibilities of the Associate Software Engineer (Data Engineer)? We are building a growing Data and AI team, and you will play a critical role in the effort to centralize structured and unstructured data for the firm. We seek a candidate with skills in data modeling, data management, and data governance who can contribute first-hand to the firm's data strategy. The ideal candidate is a self-starter with a strong technical foundation, a collaborative mindset, and the ability to navigate complex data challenges.

What ideal qualifications, skills & experience would help someone to be successful?
• Bachelor's degree in computer science or computer applications, or equivalent experience in lieu of a degree, with 3 years of industry experience.
• Strong expertise in data modeling and data management concepts; experience implementing master data management is preferred.
• Sound knowledge of Snowflake and data warehousing techniques.
• Experience building, optimizing, and maintaining data pipelines and data management frameworks to support business needs.
• Proficiency in at least one programming language, preferably Python.
• Ability to collaborate with cross-functional teams to translate business needs into scalable data and AI-driven solutions.
• Ownership of projects from ideation to production, operating in a startup-like culture within an enterprise environment.
• Excellent communication, collaboration, and an ownership mindset.
• Foundational knowledge of API development and integration.
• Knowledge of Tableau and Alteryx is good to have.

Work Shift Timings: 2:00 PM - 11:00 PM IST

Posted 3 days ago

Apply

5.0 - 10.0 years

17 - 27 Lacs

Bengaluru

Work from Office

Job Description: Snowflake Data Engineer
Location: Bengaluru

We are looking for a Snowflake Data Engineer with 5-10 years of experience in data engineering and analytics, including at least 4+ years of hands-on experience designing and developing data pipelines and solutions on the Snowflake Data Cloud platform. Strong proficiency in Python for data processing and automation is essential.

Must-Have Skills:
• Strong experience in Snowflake Data Cloud, including data modeling, performance tuning, and advanced features like Time Travel, Snowpipe, and Data Sharing (see the sketch after this posting).
• Proficiency in Python for data processing, scripting, and utility development.
• Experience building and optimizing ETL/ELT pipelines using Snowflake and cloud-native tools.
• Strong SQL skills for data transformation, validation, and analytics.
• Working knowledge of AWS services such as S3, Glue, Lambda, and Athena.
• Experience with CI/CD pipelines and version control tools like Git.
• Ability to troubleshoot and optimize data workflows for performance and reliability.

Good-to-Have Skills:
• SnowPro Core certification or equivalent data engineering certifications.
• Exposure to Apache Spark for distributed data processing.

Domain: Experience in the telecom domain is preferred, especially with billing systems, CDR processing, and reconciliation workflows.

Role & Responsibilities:
• Design and develop scalable data pipelines and analytics solutions using Snowflake and Python.
• Collaborate with data architects and analysts to understand requirements and translate them into technical solutions.
• Implement data ingestion, transformation, and curation workflows using Snowflake and AWS services.
• Ensure data quality, integrity, and compliance through robust validation and monitoring processes.
• Participate in performance tuning and optimization of Snowflake queries and pipelines.
• Support UAT and production deployments, including troubleshooting and issue resolution.
• Document technical designs, data flows, and operational procedures for internal and client use.

Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Strong communication skills to interact with technical and business stakeholders.
• Ability to present and defend technical solutions with clarity and confidence.
• Detail-oriented with a passion for building reliable and efficient data systems.
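For readers unfamiliar with the Snowflake features the must-have list names, here is a brief sketch using the snowflake-connector-python package; the account details and the table, stage, and pipe names are all placeholders.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Connection details are placeholders; none of these values come from the posting.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: read a table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")
print(cur.fetchone())

# Snowpipe: continuously load files as they arrive on an external stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO orders FROM @orders_stage FILE_FORMAT = (TYPE = 'CSV')
""")

cur.close()
conn.close()
```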

Posted 3 days ago

Apply

4.0 - 6.0 years

3 - 8 Lacs

Hyderabad

Work from Office

What is the Inv Risk Data Management team responsible for? The Risk Analyst's primary function is to compile data and reports for risk analysis; identify and reconcile data and modeling discrepancies; create and manage data visualizations; and ensure accurate reports and data are delivered to their intended audiences within a defined timeframe. Risk Analysts have in-depth knowledge and understanding of a specific investment strategy.

What are the ongoing responsibilities of a Risk Analyst?

Data Validation & Maintenance:
• Support a specific data model and asset class focus.
• Identify, reconcile, and resolve data issues of low complexity, and look for errors in data models.
• Analyze and understand existing internal tools and data warehouses to identify and confirm data quality.
• Review automated validation controls and complete issue resolution.
• Assist with setup of new accounts.

Reporting:
• Run existing standard reports and queries from Risk systems and databases.
• Ensure reports are delivered to the appropriate client(s) and/or provided via automated processes to downstream systems according to defined SLAs and time frames.
• Review, understand, and respond to basic ad-hoc requests for risk statistical information supporting Risk, Investment Management, Marketing, and other constituent teams.
• Work closely with the Technology team to test production enhancements to the Risk systems and reports.

Data Analytics:
• Manage existing analytics and create new ones as needed.
• Respond to specific requests for portfolio characteristics and risk statistics information.

What ideal qualifications, skills & experience would help someone to be successful?
• Bachelor's degree in finance, statistics, mathematics, operations research, engineering, computer science, or a related field.
• Higher education or relevant industry certifications like CFA or FRM preferable.
• 4 to 6 years of relevant work experience in the asset management industry, in particular working in the front office with exposure to investment, trading, portfolio, and risk data.
• Data quality, data analytics, and/or data management experience preferred.
• Database and SQL (Structured Query Language), Tableau or Power BI, and experience with any programming language required.

Knowledge, Skills and Abilities:
• Database/SQL (Azure, Snowflake, AWS): ability to use basic functionality to collect data from single or multiple sources.
• Data Science/Analytics (Excel, Databricks, coding, Python, machine learning, AI): ability to use basic coding functionality to compile, clean, and search through large data sets for usable information.
• Visualization (Power BI, Tableau): ability to use basic screens to help visualize the information.
• Data Modelling (Barra, Port, Axioma): ability to model portfolio securities' terms and conditions appropriately in the selected risk system and create user-defined instruments for commonly used derivatives.
• Business & Risk Knowledge: cursory knowledge of investment management and investment risk concepts; eager to seek out organized educational opportunities pertaining to business, risk, financial services, and investment management; demonstrates the ability to use interactions with peers as occasions to gain knowledge.
• Industry Trends: at the learning and skill-development stage of industry trend awareness; demonstrates eagerness to learn about data modeling, research, and insight methods by gaining knowledge from peers and organized educational opportunities.
• Initiative, Organization & Time Management: good organization and time management skills; ability to prioritize work and deliverables to meet committed timelines.
• Communication: effective written and verbal communication skills.
• Problem Solving and Decision Making: ability to make independent decisions related to day-to-day job duties and to independently solve problems of moderate scope and complexity.

Travel Requirements: possibly on occasion, and could require global travel.
Physical Requirements: ability to hear and speak to employees and outside business associates on the phone and in person; ability to view letters and numbers on a computer screen for long hours at a time; ability to maintain a professional image.
Job Level: Individual Contributor
Work Shift Timings: 2:00 PM - 11:00 PM IST

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office

Responsibilities: Design Snowflake data warehouses and ETL processes using Informatica and Python. Collaborate with Salesforce teams on data integration projects.

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Pune

Work from Office

Responsibilities: Design Snowflake data warehouses and ETL processes using Informatica. Develop Python scripts for data automation and analysis on Salesforce platform.

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Bengaluru

Work from Office

Responsibilities: Design and implement data architecture using Snowflake, Python, Salesforce, and Informatica. Ensure data security and compliance standards are met. Collaborate with cross-functional teams on project delivery.

Posted 3 days ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Systems Analyst - Business Intelligence on the Data & AI team at Lexington Partners, you will contribute to delivering business intelligence and data analytics solutions that drive meaningful business impact. The ideal candidate has a growth and learning mindset, is detail-oriented, and possesses strong analytical skills to navigate complex challenges. You should be comfortable working in a fast-paced, startup-like environment within an established enterprise and able to quickly adapt to new tools and solutions. You will play a pivotal role in the firm's business intelligence initiatives, enabling data-driven decision-making.

Key Responsibilities:
• Build dashboards and analytics using Power BI, Power Query, and related tools in the ecosystem.
• Collaborate closely with Finance and Technology teams to deliver BI solutions.
• Perform system administration of business intelligence platforms (e.g., Power BI), including user security management and maintenance.
• Own regular and ad hoc operational reporting, driving continuous improvement and scalability.
• Work collaboratively with various business functions to understand reporting needs and deliver actionable insights.
• Develop and maintain scalable, automated dashboards and reports to benchmark and track progress against key operational goals and initiatives.
• Perform data modeling by modifying existing models or creating new ones as needed.
• Assist end users in building ad hoc reports through BI tools.

What ideal qualifications, skills & experience would help someone to be successful?
• Bachelor's degree in Business Administration, Finance, Information Management, Computer Science, or a related field.
• Strong attention to detail and analytical thinking skills; must be a gatekeeper for high data quality.
• Proficiency in SQL, Power BI, Power Query, and ETL techniques.
• Experience with Microsoft Azure and the Power Platform is a plus.
• Knowledge of Snowflake is good to have.
• A versatile and effective communicator, able to interact with a diverse group of individuals with very different styles and backgrounds.
• Ability to implement row-level security on data and a strong understanding of application security layer models in Power BI.
• Capable of working independently as well as in a team-oriented environment.
• Excellent written and verbal communication skills, with a collaborative and ownership-driven mindset.

Posted 3 days ago

Apply

15.0 - 20.0 years

10 - 15 Lacs

Mumbai, Gurugram, Bengaluru

Work from Office

S&C GN - Tech Strategy & Advisory - Cloud Architecture - Senior Manager

Global Network is a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Approximately 10,000 consultants are part of this rapidly expanding network, providing specialized and strategic industry and functional consulting expertise from key locations around the world. Our Global Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world.

Practice Overview:
Skill/Operating Group: Technology Consulting
Level: Senior Manager
Location: Gurgaon/Mumbai/Bangalore/Pune/Kolkata
Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build, and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
• Providing thought leadership in cloud architecture and data modernization, with a strong focus on industry-specific trends and best practices.
• Leading large-scale cloud assessment, migration, and application modernization projects.
• Designing an application modernization strategy for enterprises to meet business goals, and building the strategy and roadmap for migration to cloud.
• Designing the application modernization architecture on cloud solutions such as AWS, Azure, GCP, and Ali Cloud, with a deep understanding of microservices and event-driven architecture (EDA).
• Leading large-scale big data modernization strategy and architecture engagements, and designing a holistic data strategy to help enterprises meet their business goals.
• Architecting large-scale data lake, DW, and Delta Lake solutions on cloud using AWS, Azure, GCP, Ali Cloud, Snowflake, Hadoop, or Cloudera.
• Designing data mesh strategy and architecture, and building the strategy and roadmap for data migration to cloud.
• Establishing the data governance strategy and operating model.
• Implementing programs/interventions that prepare the organization for implementation of new business processes.
• Providing thought leadership to downstream teams for developing offerings and assets.
• Identifying, assessing, and solving complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
• Overseeing the production and implementation of solutions covering multiple cloud technologies, associated infrastructure/application architecture, development, and operating models.
• Driving enterprise business, application, and integration architecture.
• Helping solve key business problems and challenges by enabling a cloud-based architecture transformation, painting a picture of, and charting a journey from, the current state to a to-be enterprise environment.
• Assisting our clients to build the required capabilities for growth and innovation to sustain high performance.
• Managing multi-disciplinary teams to shape, sell, communicate, and implement programs.
• Participating in client presentations and orals for proposal defense.
• Effectively communicating the target state, architecture, and topology on cloud to clients.

Qualifications:
• Bachelor's degree; MBA from a Tier-1 college preferable.
• 15-20 years of large-scale consulting experience and/or experience working with hi-tech companies, leading projects on cloud strategy, assessment, cloud architecture, application modernization, containers, information security, and information management.
• Experience in data architecture, data governance, data mesh, data security, and data management.
• Certification in DAMA (Data Management), Azure Data Architecture, Google Cloud Data Analytics, or AWS Data Analytics Architect.
• Certification on Azure/AWS/GCP.

Experience: We are seeking experienced professionals who have led large-scale engagements in application modernization, cloud architecture, and cloud strategy. The ideal candidates will have technical expertise in cloud strategy, assessments, cloud architecture, application modernization, and containers, across all stages of the innovation lifecycle, with a focus on shaping the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
• Application Assessment & Migration: experience leading large-scale assessment and cloud migration projects.
• Application Modernization: experience leading and designing composable architecture leveraging microservices and EDA on cloud platforms.
• Cloud Architecture: experience with private, public, and hybrid cloud architectures, their pros/cons, and hybrid cloud integration architecture; cloud-native application development, DevOps, and data integration within the cloud platform.
• Cloud Migration: delivering cloud migration roadmaps and managing execution from a project management perspective; cloud deployment across various hyperscaler platforms (AWS, Azure, GCP) and models (PaaS, SaaS, IaaS), containers, and virtualization platforms (VMware).
• Data SME: experience in deal shaping and strong presentation skills, leading proposals and customer orals; technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating model, change management of data transformation programs, and data modeling skills.
• Data on Cloud Architect: technical understanding of data platform strategy for data-on-cloud migrations and big data technologies; experience architecting large-scale data lake and DW solutions on cloud with one or more technologies in this space: AWS, Azure, GCP, AliCloud, Snowflake, Hadoop, Cloudera.
• Data Strategy: data capability maturity assessment, data & analytics/AI strategy, data operating model & governance, data hub enablement, data-on-cloud strategy, data architecture strategy.
• Data Transformation Lead: understanding of the data supply chain and data platforms on cloud; experience conducting alignment workshops and building value realization frameworks for data transformations; program management experience.
• Exceptional interpersonal and presentation skills: the ability to convey technology and business value propositions to senior stakeholders.
• Capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.

Other desired skills:
• Strong desire to work in technology-driven business transformation.
• Strong knowledge of technology trends across IT and digital, and how they can be applied to companies to address real-world problems and opportunities.
• Comfort conveying both high-level and detailed information, adjusting the way ideas are presented to better address varying social styles and audiences.
• Leading proof-of-concept and/or pilot implementations and defining the plan to scale implementations across multiple technology domains.
• Flexibility to accommodate client travel requirements.
• Published thought leadership: whitepapers and POVs.

Posted 3 days ago

Apply

4.0 - 6.0 years

15 - 18 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office

Job Summary: We are looking for a detail-oriented and proactive Senior Analyst - Data Analytics to join our team in Mumbai. This non-engineering role focuses on leveraging Snowflake, Databricks, and Power BI to deliver actionable insights, create dashboards, and support data-driven decision-making across the business.

Key Responsibilities:
• Analyze and interpret data using Snowflake and Databricks to provide strategic insights
• Design and develop impactful dashboards and visualizations using Power BI
• Collaborate with stakeholders to understand business requirements and translate them into analytical solutions
• Identify trends, patterns, and opportunities for business improvements
• Ensure accuracy, integrity, and consistency of data used for reporting and analytics
• Deliver clear and concise reports to business leaders and teams

Required Skills:
• 3+ years of hands-on experience in data analytics/business intelligence
• Proficiency in Snowflake and Databricks (using SQL and/or Python)
• Strong expertise in Power BI report building, DAX functions, and dashboard design
• Solid understanding of data modeling, KPIs, and data storytelling
• Strong SQL skills
• Excellent communication and analytical thinking skills
• Ability to manage multiple tasks and work cross-functionally

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 3 days ago

Apply

12.0 - 17.0 years

12 - 17 Lacs

Pune

Work from Office

Role Overview: The Technical Architect specializes in traditional ETL tools such as Informatica Intelligent Cloud Services (IICS) and similar technologies. The jobholder designs, implements, and oversees robust ETL solutions to support our organization's data integration and transformation needs.

Responsibilities:
• Design and develop scalable ETL architectures using tools like IICS and other traditional ETL platforms.
• Collaborate with stakeholders to gather requirements and translate them into technical solutions.
• Ensure data quality, integrity, and security throughout the ETL processes.
• Optimize ETL workflows for performance and reliability.
• Provide technical leadership and mentorship to development teams.
• Troubleshoot and resolve complex technical issues related to ETL processes.
• Document architectural designs and decisions for future reference.
• Stay updated on emerging trends and technologies in ETL and data integration.

Key Technical Skills & Responsibilities:
• 12+ years of experience in data integration and ETL development, with at least 3 years in an Informatica architecture role.
• Extensive expertise in Informatica PowerCenter, IICS, and related tools (Data Quality, EDC, MDM).
• Proven track record of designing ETL solutions for enterprise-scale data environments.
• Advanced proficiency in Informatica PowerCenter and IICS for ETL/ELT design and optimization.
• Strong knowledge of SQL, Python, or Java for custom transformations and scripting.
• Experience with data warehousing platforms (Snowflake, Redshift, Azure Synapse) and data lakes.
• Familiarity with cloud platforms (AWS, Azure, GCP) and their integration services.
• Expertise in data modeling, schema design, and integration patterns.
• Knowledge of CI/CD, Git, and infrastructure-as-code (e.g., Terraform).
• Experience working on proposals, customer workshops, assessments, etc. is preferred.
• Good communication and presentation skills.

Primary Skills: Informatica, IICS, data lineage and metadata management, data modeling, data governance, data integration architectures, Informatica Data Quality

Eligibility Criteria:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• Proven experience in ETL architecture and development using tools like IICS.
• Strong understanding of data integration, transformation, and warehousing concepts.
• Proficiency in SQL and scripting languages.
• Experience with cloud-based ETL solutions is a plus.
• Familiarity with Agile development methodologies.
• Excellent problem-solving and analytical skills.
• Strong communication and leadership abilities.
• Knowledge of data governance and compliance standards.
• Ability to work in a fast-paced environment and manage multiple priorities.

Posted 3 days ago

Apply

5.0 - 10.0 years

25 - 40 Lacs

Gurugram

Work from Office

Job Title: Data Engineer
Job Type: Full-time
Department: Data Engineering / Data Science
Reports To: Data Engineering Manager / Chief Data Officer

About the Role: We are looking for a talented Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining robust data pipelines and systems that process and store large volumes of data. You will collaborate closely with data scientists, analysts, and business stakeholders to deliver high-quality, actionable data solutions. This role requires a strong background in data engineering, database technologies, and cloud platforms, along with the ability to work in an Agile environment to drive data initiatives forward.

Responsibilities:
• Design, build, and maintain scalable and efficient data pipelines that move, transform, and store large datasets (see the sketch after this posting).
• Develop and optimize ETL processes using tools such as Apache Spark, Apache Kafka, or AWS Glue.
• Work with SQL and NoSQL databases to ensure the availability, consistency, and reliability of data.
• Collaborate with data scientists and analysts to ensure data requirements and quality standards are met.
• Design and implement data models, schemas, and architectures for data lakes and data warehouses.
• Automate manual data processes to improve efficiency and data processing speed.
• Ensure data security, privacy, and compliance with industry standards and regulations.
• Continuously evaluate and integrate new tools and technologies to enhance data engineering processes.
• Troubleshoot and resolve data quality and performance issues.
• Participate in code reviews and contribute to a culture of best practices in data engineering.

Requirements:
• 3-10 years of experience as a Data Engineer or in a similar role.
• Strong proficiency in SQL and experience with NoSQL databases (e.g., MongoDB, Cassandra).
• Experience with big data technologies such as Apache Hadoop, Spark, Hive, and Kafka.
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud.
• Proficiency in Python, Java, or Scala for data processing and scripting.
• Familiarity with data warehousing concepts, tools, and technologies (e.g., Snowflake, Redshift, BigQuery).
• Experience working with data modeling, data lakes, and data pipelines.
• Solid understanding of data governance, data privacy, and security best practices.
• Strong problem-solving and debugging skills.
• Ability to work in an Agile development environment.
• Excellent communication skills and the ability to work cross-functionally.
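As a sketch of the kind of batch pipeline these responsibilities describe, here is a minimal PySpark ETL job; the paths, columns, and aggregation are illustrative assumptions only.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: raw events from a data lake path (placeholder location).
raw = spark.read.json("s3a://my-bucket/raw/events/")

# Transform: keep valid rows and aggregate per user per day.
daily = (
    raw.filter(F.col("user_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("user_id", "event_date")
       .agg(F.count("*").alias("event_count"))
)

# Load: write a partitioned table back to the curated layer.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://my-bucket/curated/daily_user_events/"
)

spark.stop()
```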

Posted 3 days ago

Apply

6.0 - 10.0 years

15 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Key Responsibilities:
• Design, develop, and optimize data pipelines and ETL processes for data warehousing projects.
• Work extensively with Snowflake, ensuring efficient data modeling and query optimization.
• Develop and manage data workflows using Azure Data Factory (ADF) for seamless data integration.
• Implement data transformations, testing, and documentation using dbt (a reconciliation-style check of the kind such testing automates is sketched after this posting).
• Collaborate with cross-functional teams to ensure data accuracy, consistency, and security.
• Troubleshoot data-related issues.
• (Optional) Utilize Python for scripting, automation, and data processing tasks.

Required Skills & Qualifications:
• Experience in data warehousing with a strong understanding of best practices.
• Hands-on experience with Snowflake (data modeling, query optimization).
• Proficiency in Azure Data Factory (ADF) for data pipeline development.
• Strong working knowledge of dbt (Data Build Tool) for data transformations.
• (Optional) Experience in Python scripting for automation and data manipulation.
• Good understanding of SQL and query optimization techniques.
• Experience in cloud-based data solutions (Azure).
• Strong problem-solving skills and the ability to work in a fast-paced environment.
• Experience with CI/CD pipelines for data engineering.
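Here is a minimal sketch of the reconciliation-style check referenced above, using the snowflake-connector-python package; the credentials and table names are placeholders, and in practice dbt tests or an ADF validation activity would automate the same idea.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials and table names; nothing here comes from the posting.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ETL_WH", database="DW", schema="PUBLIC",
)
cur = conn.cursor()

def row_count(table: str) -> int:
    # Simple count used to reconcile a staging table against its target.
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

staged = row_count("STG_ORDERS")
loaded = row_count("DIM_ORDERS")
if staged != loaded:
    raise ValueError(f"Row count mismatch: staged={staged}, loaded={loaded}")
print("Load reconciled:", loaded, "rows")

cur.close()
conn.close()
```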

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

You will be responsible for designing, building, and configuring Snowflake integrations to align with business processes and application requirements. This will involve leading discussions with clients, conducting workshops, managing project integration deliverables, and collaborating with other teams. Additionally, you will analyze the current landscape and redesign Snowflake integrations based on evolving requirements. Your expertise should include a strong understanding of Snowflake's INSERT, UPDATE, and MERGE commands, as well as familiarity with Snowflake stages: user stage, table stage, named stage, and external stage (both are illustrated in the sketch below). You should also have experience working with the various types of Snowflake tables and views, including external tables. In this role, you will be expected to leverage your integration experience to develop Snowflake queries for data extraction from SAP ECC and S/4 applications into a planning system. You should also possess knowledge of the touchpoints between Kinaxis and SAP, enabling seamless data integration between the two platforms. Bristlecone, the organization you will be joining, is a leading provider of AI-powered application transformation services for the connected supply chain. Its solutions in Digital Logistics, Cognitive Manufacturing, Autonomous Planning, Smart Procurement, and Digitalization are designed to enhance speed, visibility, automation, and resiliency in response to industry changes. As an Equal Opportunity Employer, Bristlecone values diversity and inclusivity in its workforce and is committed to providing a supportive and inclusive work environment for all employees. In this role, you will be expected to understand and adhere to Information Security policies, guidelines, and procedures to safeguard organizational data and information systems. This includes participating in information security training, promptly reporting any suspected security breaches, and fulfilling additional information security responsibilities specific to your job role.
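A short sketch of the commands and stage types named above, using the snowflake-connector-python package; the connection details and the table, column, and stage names are hypothetical.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder connection; the posting names the commands, not an environment.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="INT_WH", database="PLANNING", schema="PUBLIC",
)
cur = conn.cursor()

# Load staged files from a named internal stage. The other stage flavours
# appear in SQL as @~ (user stage) and @%demand_plan (table stage); external
# stages are created over cloud storage with CREATE STAGE ... URL='s3://...'.
cur.execute("COPY INTO staged_demand FROM @named_stage FILE_FORMAT = (TYPE = 'CSV')")

# MERGE is the upsert that combines the INSERT and UPDATE paths.
cur.execute("""
    MERGE INTO demand_plan AS t
    USING staged_demand AS s
      ON t.material_id = s.material_id AND t.week = s.week
    WHEN MATCHED THEN UPDATE SET t.quantity = s.quantity
    WHEN NOT MATCHED THEN INSERT (material_id, week, quantity)
      VALUES (s.material_id, s.week, s.quantity)
""")

cur.close()
conn.close()
```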

Posted 3 days ago

Apply

12.0 - 18.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineer with 12 to 18 years of experience, you will work remotely on a 3-month extendable project focusing on Data Warehousing (DWH), ETL, GCP, and CDP as an Architect. Your role will involve a deep understanding of customer data models, behavioral analytics, segmentation, and machine learning models, and you should have expertise in API integration, real-time event processing, and data pipelines. The ideal candidate will have prior experience in ETL and DWH, along with a strong background in designing and implementing solutions in cloud environments like GCP and on cloud data platforms such as Snowflake and BigQuery. Experience developing customer-facing user interfaces using BI tools like Google Looker, Power BI, or other open-source tools is essential. You should have a track record of Agile delivery, be self-motivated, and possess strong communication and interpersonal skills. As a motivated self-starter, you should be adept at adapting to changing priorities and able to think quickly to design and deliver effective solutions. To excel in this role, you should ideally have experience as a Segment CDP platform developer and a minimum of 15-18 years of relevant experience with a degree in B.Tech/MCA/M.Tech. If you are looking for a challenging opportunity to leverage your expertise in data engineering, analytics, and cloud platforms, this role offers an exciting prospect to contribute to a dynamic project.

Posted 3 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You will be working as a Mid-Level Full Stack Developer in the field of data visualization and reporting, with expertise in tools like Power BI, SSIS, and Snowflake. Your primary responsibilities will include the analysis, design, and development of online dashboards, visualizations, and offline reports in an agile environment. You will be involved in the full software development life cycle, from conception to deployment. Mandatory skills include proficiency in reporting tools such as Sigma Computing, Power BI, and Microsoft SQL Server, as well as microservices and event-driven architecture using C#/.NET. Strong familiarity with Artificial Intelligence (AI) and GenAI tools for development acceleration is essential, along with solid experience in data modeling and data engineering using Snowflake. Knowledge of Agile methodologies and GenAI would be considered nice-to-have skills for this role. The work model for this position is hybrid, and the location of work is Bangalore. The ideal candidate should have 7-10 years of relevant experience and be available to join immediately or within 15 days. The interview process for this position will be conducted virtually. If this opportunity aligns with your expertise and interests, please share your resume with netra.s@twsol.com.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineering Lead, you will collaborate with marketing, analytics, and business teams to understand data requirements and develop data solutions that address critical business inquiries. Your responsibilities will include leading the implementation and strategic optimization of tag management solutions such as Tealium and Google Tag Manager (GTM) to ensure precise and comprehensive data capture. You will leverage your expertise in Google Analytics 4 (GA4) to configure and customize data collection processes for enhanced insights. Additionally, you will architect scalable and performant data models on Google Cloud, utilizing BigQuery for data warehousing and analysis (see the sketch below). In this role, you will use SQL and scripting languages like JavaScript and HTML for data extraction, manipulation, and visualization. You will also play a pivotal role in mentoring and guiding a team of engineers, fostering a culture of collaboration and continuous improvement. Staying updated on the latest trends and technologies in data engineering and analytics, you will bring innovative ideas to the table and drive deliverables by mentoring team members effectively. To qualify for this position, you must have experience with Tealium and tag management tools, along with a proven ability to communicate effectively to build positive relationships and drive project success. Your expertise in tag management solutions such as Tealium and GTM will be crucial for comprehensive website and app data tracking, including the implementation of scripting languages for tag extensions. Proficiency in Tealium concepts like iQ Tag Management, AudienceStream, EventStream API Hub, Customer Data Hub, and debugging tools is essential. Experience utilizing Google Analytics 4 (GA4) for advanced data collection and analysis, as well as knowledge of Google Cloud, particularly BigQuery for data warehousing and analysis, will be advantageous. Preferred qualifications include experience in a similar industry (e.g., retail, e-commerce, digital marketing), proficiency with Python/PySpark for data processing and analysis, working knowledge of Snowflake for data warehousing, experience with Airflow or similar workflow orchestration tools for managing data pipelines, and familiarity with AWS cloud technology. Skills in frontend technologies like React, JavaScript, and HTML, coupled with Python expertise for backend development, will also be beneficial. Overall, as a Data Engineering Lead, you will play a critical role in designing robust data pipelines and architectures that support data-driven decision-making for websites and mobile applications, ensuring seamless data orchestration and processing through best-in-class ETL tools and technologies. Your expertise in Tealium, Google Analytics 4, and SQL will be instrumental in driving the success of data engineering initiatives within the organization.
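As a sketch of GA4 analysis on BigQuery along the lines described above, here is a minimal query using the google-cloud-bigquery client; the project and dataset names are placeholders (GA4 exports land in date-sharded events_YYYYMMDD tables).

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses application-default credentials

# Daily event counts from a GA4 BigQuery export; the project and dataset
# identifiers below are hypothetical (GA4 datasets are named analytics_<id>).
sql = """
    SELECT event_date, event_name, COUNT(*) AS events
    FROM `my-project.analytics_123456.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240107'
    GROUP BY event_date, event_name
    ORDER BY events DESC
"""

for row in client.query(sql).result():
    print(row.event_date, row.event_name, row.events)
```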

Posted 3 days ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

About the Team: As part of the DoorDash organization, you will be joining a data-driven team that values timely, accurate, and reliable data to make informed business and product decisions. Data serves as the foundation of DoorDash's success, and the Data Engineering team is responsible for building database solutions tailored to various use cases such as reporting, product analytics, marketing optimization, and financial reporting. By implementing robust data structures and data warehouse architecture, this team plays a crucial role in facilitating decision-making processes at DoorDash. Additionally, the team focuses on enhancing the developer experience by developing tools that support the organization's high-velocity demands.

About the Role: DoorDash is seeking a dedicated Data Engineering Manager to lead the development of enterprise-scale data solutions. In this role, you will serve as a technical expert on all aspects of data architecture, empowering data engineers, data scientists, and DoorDash partners. Your responsibilities will include fostering a culture of engineering excellence, enabling engineers to deliver reliable and flexible solutions at scale. Furthermore, you will be instrumental in building and nurturing a high-performing team, driving innovation and success in a dynamic and fast-paced environment.

In this role, you will:
- Lead and manage a team of data engineers, focusing on hiring, building, growing, and nurturing impactful business-focused data teams.
- Drive the technical and strategic vision for embedded pods and foundational enablers to meet current and future scalability and interoperability needs.
- Strive for continuous improvement of data architecture and development processes.
- Balance quick wins with long-term strategy and engineering excellence, breaking down large systems into user-friendly data assets and reusable components.
- Collaborate cross-functionally with stakeholders, external partners, and peer data leaders.
- Utilize effective planning and execution tools to ensure short-term and long-term team and stakeholder success.
- Prioritize reliability and quality as essential components of data solutions.

Qualifications:
- Bachelor's, Master's, or Ph.D. in Computer Science or an equivalent field.
- Over 10 years of experience in data engineering, data platform, or related domains.
- Minimum of 2 years of hands-on management experience.
- Strong communication and leadership skills, with a track record of hiring and growing teams in a fast-paced environment.
- Proficiency in programming languages such as Python, Kotlin, and SQL.
- Prior experience with technologies like Snowflake, Databricks, Spark, Trino, and Pinot.
- Familiarity with the AWS ecosystem and large-scale batch/real-time ETL orchestration using tools like Airflow, Kafka, and Spark Streaming.
- Knowledge of data lake file formats including Delta Lake, Apache Iceberg, Glue Catalog, and S3.
- Proficiency in system design and experience with AI solutions in the data space.

At DoorDash, we are dedicated to fostering a diverse and inclusive community within our company and beyond. We believe that innovation thrives in an environment where individuals from diverse backgrounds, experiences, and perspectives come together. We are committed to providing equal opportunities for all and creating an inclusive workplace where everyone can excel and contribute to our collective success.

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Cloud Data Ops Engineer specializing in AWS and Snowflake at EY, you will play a vital role on the Cloud Data Ops team, focusing on Snowflake and BI tools for the cloud data platform. You will provide operational support and collaborate with application development teams, infrastructure teams, and vendors to resolve technical support incidents efficiently.

Your primary responsibilities will involve working with Snowflake, Linux, Python, ETL, and scheduler tools to ensure smooth operations. You will need a comprehensive understanding of Snowflake's service offerings and experience with cloud data platforms, compute, storage, and network architecture. You will also be expected to resolve critical production issues, participate in Major Incident Management calls, and support disaster recovery events.

Communication skills are key, as you will work closely with various teams, vendors, and management. Upholding a culture of honesty and transparency is essential, as is championing effective verbal and written communication. Leadership competencies at this level include accountability, global collaboration, effective communication, influencing, innovation, and creativity. You should be capable of thinking innovatively, generating new ideas, and confidently pursuing challenges to identify new opportunities.

On the technical side, you should have a minimum of 5 years of experience in IT with a focus on cloud platforms. Practical experience with Snowflake, AWS, Linux, Python, SQL, ETL, scheduler tools, and automation tools like Terraform is necessary. Familiarity with BI tools such as Power BI, Sigma, and Looker, with enterprise-level applications, and with alerting, monitoring, dashboarding, and escalation is also required. A Bachelor's degree in computer science or engineering is preferred, and the Snowflake SnowPro certification would be advantageous.

Join EY to cultivate a career tailored to your unique talents and contribute to building a better working world for all. EY's global scale, inclusive culture, and cutting-edge technology provide the support you need to excel. Your voice and perspective will play a crucial role in shaping EY's future success. Embrace this opportunity to create an exceptional experience for yourself while positively impacting the world around you.
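For a sense of the day-to-day operational work, here is a hedged Python sketch that flags long-running Snowflake queries. It assumes the snowflake-connector-python library; the account, credentials, warehouse, and database names are placeholders.

# Flag recent Snowflake queries that ran longer than five minutes (illustrative).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="ops_user",        # placeholder
    password="***",
    warehouse="OPS_WH",     # placeholder
    database="ANALYTICS",   # a database context is needed for INFORMATION_SCHEMA
)

sql = """
    SELECT query_id, user_name, total_elapsed_time / 1000 AS seconds
    FROM TABLE(information_schema.query_history(result_limit => 1000))
    WHERE total_elapsed_time > 300000
    ORDER BY total_elapsed_time DESC
"""

for query_id, user_name, seconds in conn.cursor().execute(sql):
    print(f"{query_id} run by {user_name}: {seconds:.0f}s")

In practice, a check like this would feed the alerting and dashboarding tooling the posting mentions rather than print to stdout.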

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

haryana

On-site

As an ETL Developer, you should have at least 4 years of experience, with specific expertise in Snowflake and SQL. You will be working in locations such as Pune and Gurgaon.

Your responsibilities will include:
- Applying deep technical knowledge to create and implement effective solutions that align with client objectives.
- Using platforms such as Snowflake, AWS, and SQL proficiently to deliver innovative solutions.
- Applying data modeling and warehousing concepts to solve project-based technical challenges.
- Ensuring the quality of deliverables by adhering to architecture and design guidelines and coding best practices, and by participating in design/code reviews.
- Developing and automating data pipelines, orchestration, and ingestion processes hands-on.
- Collaborating closely with project managers and leads to understand their requirements, quality metrics, and IT standards, and delivering outcomes that meet client expectations.
- Communicating technical information effectively with stakeholders to ensure project success.
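As a small taste of the ingestion work described above, here is a hedged Python sketch that stages a local file and bulk-loads it into a Snowflake table with COPY INTO; the connection details, schema, file, and table names are hypothetical, and the target table is assumed to already exist.

# Stage a CSV and bulk-load it into Snowflake (illustrative only).
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # placeholders
    warehouse="LOAD_WH",   # COPY INTO needs a running warehouse
)
cur = conn.cursor()

cur.execute("USE SCHEMA analytics.raw")                  # placeholder schema
cur.execute("PUT file:///tmp/orders.csv @%orders_raw")   # upload to the table stage
cur.execute("""
    COPY INTO orders_raw
    FROM @%orders_raw
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
for result in cur.fetchall():
    print(result)  # per-file load status returned by COPY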

Posted 3 days ago

Apply

10.0 - 14.0 years

0 Lacs

coimbatore, tamil nadu

On-site

As a Data Engineering Lead/Architect with over 10 years of experience, you will play a crucial role in architecting and designing data solutions that meet business requirements efficiently. Collaborating with cross-functional teams, you will define data architectures, models, and integration strategies to ensure the successful implementation of data pipelines, ETL processes, and data warehousing solutions.

Your expertise in Snowflake will be essential for building and optimizing data warehouses. You will develop and maintain Snowflake data models and schemas, following best practices for cost analysis, resource allocation, and security configuration to support reporting and analytics effectively.

Using Azure cloud services and Databricks, you will manage and process large datasets efficiently. Your responsibilities will include building, deploying, and maintaining data pipelines on Azure Data Factory, Azure Databricks, and other Azure services. Implementing data warehousing best practices and ensuring data quality, consistency, and reliability will be a key focus. You will also create and manage data integration processes, including real-time and batch data movement between systems.

Your mastery of SQL and PL/SQL will be vital for writing complex queries to extract, transform, and load data. You will optimize SQL queries and database performance for high-volume data processing, continuously monitor and enhance the performance of data pipelines and storage systems, and troubleshoot and resolve data-related issues promptly to minimize downtime and maintain data availability.

Documenting data engineering processes, data flows, and architectural decisions will be crucial for effective collaboration with data scientists, analysts, and stakeholders. Implementing data security measures and adhering to compliance standards such as GDPR and HIPAA will be essential to protect sensitive data.

Beyond your technical skills, you are expected to demonstrate leadership: driving data engineering strategy, engaging in sales and proposal activities, developing strong customer relationships, and mentoring team members. Your experience with cloud-based data solution architectures, client engagement, and leading technical teams will be a valuable asset in this role.

To qualify for this position, you should hold a bachelor's or master's degree in computer science or a related field and have over 10 years of experience in Data Engineering with a strong focus on architecture. Proven expertise in Snowflake, Azure, and Databricks, along with comprehensive knowledge of data warehousing concepts, ETL processes, and data integration techniques, is required. Exceptional SQL and PL/SQL skills, experience with performance tuning, and strong problem-solving abilities are essential. Excellent communication skills and relevant certifications in technologies such as Snowflake and Azure will be advantageous.
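To illustrate the kind of Databricks pipeline step this role owns, here is a minimal PySpark sketch that aggregates raw orders into a curated Delta table; the storage paths and column names are hypothetical, and Delta Lake support is assumed (it ships with Databricks).

# Batch transform: raw Parquet orders -> daily revenue in Delta (illustrative).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

orders = spark.read.parquet(
    "abfss://raw@mystorageacct.dfs.core.windows.net/orders/"  # placeholder path
)
daily = (
    orders.withColumn("order_date", F.to_date("order_ts"))
          .groupBy("order_date")
          .agg(F.sum("amount").alias("revenue"),
               F.count("*").alias("order_count"))
)
daily.write.format("delta").mode("overwrite").save(
    "abfss://curated@mystorageacct.dfs.core.windows.net/daily_revenue/"  # placeholder
)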

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

The Data Engineer plays a critical role in the organization by designing, building, and maintaining scalable data pipelines and infrastructure. Collaborating closely with cross-functional teams, you ensure the smooth flow of data and enhance data-driven decision-making.

Your key responsibilities include designing, developing, and maintaining data pipelines and ETL processes using tools such as Snowflake, Azure, AWS, Databricks, Informatica, and DataStage. You will work with data scientists and stakeholders to understand data requirements, ensuring data availability and integrity. Optimizing and tuning the performance of data infrastructure and processing systems, implementing data security and privacy measures, troubleshooting and tuning ETL processes, and developing documentation for data infrastructure and processes are also crucial aspects of the role.

You will participate in the evaluation and selection of new technologies and tools to enhance data engineering capabilities, provide support and mentorship to junior data engineers, and adhere to data engineering best practices while maintaining high standards of quality. Collaboration with cross-functional teams to support data-related initiatives and projects is essential for success in this role.

To qualify for this position, you should possess a Bachelor's or Master's degree in Computer Science, Engineering, or a related field, along with proven experience in data engineering, ETL development, and data warehousing. Proficiency in Snowflake, AWS, Azure, Databricks, Informatica, and DataStage; strong programming skills in languages such as Python, SQL, or Java; experience with big data technologies and distributed computing; and knowledge of data modeling and database design principles are required.

You must be able to work with stakeholders to understand data requirements and translate them into technical solutions, and you should know data governance, data quality, and data integration best practices. Experience with cloud data platforms and services, excellent problem-solving and analytical abilities, strong communication and collaboration skills, and the ability to thrive in a fast-paced, dynamic environment are essential. Relevant certifications in cloud platforms and data engineering, such as AWS Certified Big Data - Specialty, Microsoft Certified: Azure Data Engineer, and SnowPro Core Certification, will be advantageous.

In short, you will build and run the data pipelines and infrastructure that keep the organization's data-driven decision-making reliable and secure.
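Since the role stresses data availability and integrity, here is a small hedged Python sketch of the kind of data-quality gate a pipeline step might run before loading; the key name and threshold are hypothetical.

# Fail fast if extracted records are missing their join key (illustrative).
def check_null_rate(rows, key="customer_id", max_rate=0.01):
    if not rows:
        raise ValueError("extract step produced no rows")
    nulls = sum(1 for r in rows if r.get(key) is None)
    rate = nulls / len(rows)
    if rate > max_rate:
        raise ValueError(f"{key} null rate {rate:.2%} exceeds {max_rate:.0%}")
    return rows

# Usage inside a pipeline step: load(check_null_rate(extract_orders()))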

Posted 3 days ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies