
96 Fivetran Jobs - Page 2

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Title: Director - Data Engineering GCL-F Introduction To Role Are you ready to lead the charge in transforming healthcare data into actionable insights? Alexion is seeking a dynamic and visionary data leader to architect and design innovative data integration solutions that drive innovation across the company. As Director of Data Engineering, you'll play a pivotal role in shaping the future of patient-centric data platforms, enabling better decision-making and delivering impactful results. Your expertise will empower multi-functional teams, support global business strategies, and contribute to the development of groundbreaking data pipelines. If you're passionate about using data to drive business innovation and have a proven track record in the Pharma/Biopharma industry, we want you on our team! Accountabilities Proactively find opportunities and threats using data platforms and integration to enhance decision-making at Alexion. Drive multi-functional collaboration to provide strategic advice on key business questions. Educate executive, business, and IT teams on the value of data management platforms. Collaborate with leadership and IT to support strategies that optimize Alexion's business growth. Work with multi-functional teams to structure problems, extract data, and develop integrated information solutions. Partner with authorities to advance data management and modeling for self-service business analysis. Provide technical leadership throughout project phases from discovery to delivery. Evolve data platforms, tools, and methods for continuous improvement. Deliver high-quality solutions and responses to ad-hoc data requests. Essential Skills/Experience A Master's Degree in Computer Science, Information Systems, Engineering, Business, or a related scientific/technical field preferred. Minimum of 10 years of experience in data engineering, business analysis, and data management. Exceptional verbal and written communication skills; ability to convey analytical insights in actionable business terms. Highly motivated self-starter with the confidence to present complex information effectively to all audiences. Strong analytical, logical thinking, and organizational skills; capable of managing multiple projects simultaneously. Ability to anticipate future business trends and integrate them into IT and business practices. Proven track record of effective functional and multi-functional collaboration and leadership. Diligent self-starter; able to work independently and in a team environment. Desire and ability to learn and implement new tools and analytic capabilities. Experience designing methods, processes, and systems for consolidating and analyzing structured/unstructured data from diverse sources. Experience developing advanced software applications, algorithms, querying, and automated processes for data evaluation. Proven ability to design complex, large-scale data solutions that are scalable, robust, secure, and resilient. Pharmaceutical or Life Sciences industry experience a plus. Experience using dbt, Fivetran, GitHub, Apache Airflow. Extensive hands-on experience with SQL, Python, ETL/ELT frameworks, and data orchestration pipelines. AWS Architecture Framework knowledge and certification. Expertise in Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone, data sharing, Time Travel, SnowSQL, Snowpipe, Streamlit, and Cortex. Experience in data quality and observability tools/methodologies.
Understanding of FAIR and TRUSTed data product principles. Knowledge of data governance frameworks/compliance standards relevant to the life sciences industry (GDPR/HIPAA). Experience with ETL/ELT/data loading tools using Apache Airflow and AWS Glue with Python. Experience applying AI technologies to ELT processes and automating self-healing data pipelines. Experience working with data science operations teams using serverless architectures, Kubernetes, and Docker/containerization. Solid understanding of analytic data architecture and data modeling concepts/principles (data lakes/warehouses/marts). Data warehousing methodologies/modeling techniques (Kimball/3NF/Star Schema). Desirable Skills/Experience Prior experience of 10+ years as a Data Platform or Technical Leader in the biotech/pharma industry. Advanced experience with cloud platforms beyond AWS (Azure/Google Cloud/Databricks) for data engineering/storage solutions. When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world. AstraZeneca offers an environment where you can make a significant impact by connecting across the business to influence patient outcomes positively. Here you'll collaborate with leading experts using innovative techniques to drive disruptive transformation as we become a digital and data-led enterprise. Our inclusive team thrives on diversity and innovation, empowering you to expand your knowledge while delivering greater value for patients every day. Ready to take on this exciting challenge? Apply now to join our team! Date Posted 21-Aug-2025 Closing Date 07-Sept-2025 Alexion is proud to be an Equal Employment Opportunity and Affirmative Action employer. We are committed to fostering a culture of belonging where every single person can belong because of their uniqueness. The Company will not make decisions about employment, training, compensation, promotion, and other terms and conditions of employment based on race, color, religion, creed or lack thereof, sex, sexual orientation, age, ancestry, national origin, ethnicity, citizenship status, marital status, pregnancy (including childbirth, breastfeeding, or related medical conditions), parental status (including adoption or surrogacy), military status, protected veteran status, disability, medical condition, gender identity or expression, genetic information, mental illness, or other characteristics protected by law. Alexion provides reasonable accommodations to meet the needs of candidates and employees. To begin an interactive dialogue with Alexion regarding an accommodation, please contact [HIDDEN TEXT]. Alexion participates in E-Verify.
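
For readers new to the Snowflake features this posting lists, here is a minimal SQL sketch of zero-copy cloning, Time Travel, and secure data sharing. It is an illustrative sketch, not anything from the posting itself; every database, schema, table, and account name is hypothetical.

```sql
-- Zero-copy clone: an instant, metadata-only copy for dev/test work.
CREATE DATABASE analytics_dev CLONE analytics_prod;

-- Clone combined with Time Travel: recreate a table as it was an hour ago.
CREATE TABLE curated.patient_events_restored
  CLONE curated.patient_events AT (OFFSET => -3600);

-- Secure data sharing: expose a read-only view to a consumer account.
CREATE SHARE clinical_share;
GRANT USAGE ON DATABASE analytics_prod TO SHARE clinical_share;
GRANT USAGE ON SCHEMA analytics_prod.curated TO SHARE clinical_share;
GRANT SELECT ON VIEW analytics_prod.curated.v_patient_summary TO SHARE clinical_share;
ALTER SHARE clinical_share ADD ACCOUNTS = partner_account;  -- consumer account locator
```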

Posted 3 weeks ago

Apply

10.0 - 14.0 years

10 - 20 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM along with Business and Enterprise Technology teams. As a Senior Data Engineer, you will: Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations. Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes. Cultivate collaboration with corporate engineering, product teams, and other engineering groups. Lead and mentor engineering discussions, advocating for best practices. Actively participate in design and code reviews. Access and explore third-party data APIs to determine the data required to meet business needs. Ensure data quality and integrity across different sources and systems. Manage data pipelines for both analytics and operational purposes. Continuously enhance processes and policies to improve SLA and SOX compliance. You'll be a great addition to the team if you: Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field. Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments. Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment. Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes. Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB). Are proficient in programming with Python or other scripting languages. Have familiarity with columnar OLAP databases and data modeling. Have experience in building ELT/ETL processes using tools like dbt, Airflow, Fivetran, CI/CD using GitHub, and reporting in Tableau. Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements. Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
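
Because this role leans on dbt, Fivetran, and Airflow for ELT, a small dbt incremental model gives a concrete feel for the day-to-day work. This is a generic sketch; the model, source, and column names are hypothetical, not taken from the posting.

```sql
-- models/marts/fct_orders.sql (hypothetical dbt model)
-- Incremental materialization: only new or changed rows are processed per run.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- On incremental runs, keep only rows newer than what is already loaded.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```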

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 15 Lacs

Pune, Chennai, Bengaluru

Hybrid

Job opportunity with Hexaware Technologies! We are hiring a Snowflake / Fivetran developer (4 to 10 years). If interested, please share the following details to manojkumark2@hexaware.com: Total IT Exp: Exp in Snowflake: Exp in Fivetran: CCTC & ECTC: Current Company & Location: NP/LWD:

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

The Data Engineer position at Toro Technology Center India involves playing a crucial role in facilitating the capture, ingestion, and transformation of data to provide data assets for driving business decisions. By collaborating with cross-functional teams, designing and maintaining data pipelines, and ensuring the accuracy and accessibility of data, you will contribute to the success and growth of The Toro Company. To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field, along with at least 2 years of experience in data engineering. Proficiency in SQL, programming languages such as Python or JavaScript, and experience with cloud-based data platforms, particularly Microsoft Azure, are essential requirements for this position. Key Responsibilities: - Collaborate with cross-functional teams to identify and implement data-driven solutions. - Design, develop, and maintain data pipelines using Fivetran and dbt. - Implement and maintain the Snowflake data warehouse. - Work closely with data science and analytics teams to ensure data accuracy, reliability, and accessibility while adhering to security protocols. - Create and maintain documentation of data pipeline processes and procedures. The Toro Company values its employees and offers a competitive salary, top-tier medical insurance, and a range of benefits including: - Dress for your day policy to enhance productivity in a comfortable work environment. - Conveniently located in Baner, Pune for easy commuting. - Onsite cafeteria serving breakfast and lunch along with snacks and coffee. - Complimentary use of the fitness facility and access to mental health and financial resources. - 20 hours of paid time off for volunteering in the community. - Flexible work arrangements with a hybrid schedule promoting team-building and flexibility. Join us at The Toro Company and be a part of a company with a global footprint, a passion for helping people beautify landscapes, and a commitment to employee well-being and growth.
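
The Fivetran-plus-dbt pattern this role describes usually starts with a staging model over the schema a Fivetran connector lands. A hedged sketch follows: the source and business columns are hypothetical, while `_fivetran_synced` and `_fivetran_deleted` are Fivetran's standard bookkeeping columns.

```sql
-- models/staging/stg_crm__contacts.sql (hypothetical dbt staging model)
-- Renames and filters raw connector output into a clean interface.
with source as (
    select * from {{ source('fivetran_crm', 'contact') }}
),
renamed as (
    select
        id               as contact_id,
        email_address    as email,
        created_date     as created_at,
        _fivetran_synced as synced_at                -- Fivetran sync timestamp
    from source
    where not coalesce(_fivetran_deleted, false)     -- drop soft-deleted rows
)
select * from renamed
```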

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

The Emerson Test & Measurement Group (NI, now known as Emerson Test & Measurement) Insights and Analytics team is seeking highly skilled and experienced Sr. Strategic Consultants to join our dynamic and growing team. The ideal candidates should have a strong background in data modeling, business analytics, and a passion for solving complex data problems. As a Sr. Strategic Consultant, you will be responsible for designing, building, and maintaining scalable data pipelines, analytics solutions, and visualizations to support our business objectives. This role involves close collaboration with the leadership team in a highly matrixed environment, directly supporting the company's Software Sales Business leadership teams and key initiatives, and performing analytics around performance, customer and contact base, product portfolio, financials, and markets to support NI's Business Units. Additionally, you will work closely with key functional areas including Sales, Marketing, Product Management, Finance, and R&D. Your Responsibilities Will Include: - Designing, developing, and maintaining scalable data pipelines and ETL processes with DBT and Power BI. - Collaborating with analysts, data scientists, and other team members to understand data requirements and deliver high-quality data solutions. - Implementing data integration, transformation, and validation processes. - Optimizing and tuning data pipelines for cost, performance, and reliability. - Developing and maintaining data models, schemas, and documentation. - Ensuring data quality, integrity, and security across all data systems. - Monitoring and resolving data pipeline issues and implementing solutions. - Staying up to date with the latest industry trends and technologies in data engineering and analytics. - Building effective partnerships with business collaborators and team members. - Developing a deep understanding of business strategies, processes, and operations and the resulting data. - Applying data and advanced analytics capabilities to provide insights that solve key business problems and drive alignment between functions, collaborators, and information producers. - Developing scalable data, reporting, and analytics solutions. - Extracting, transforming, and analyzing large sets of data. - Gathering and effectively communicating insights. - Driving positive business impact through insights. Who You Are: You are a self-driven leader who shows initiative and handles ambiguity gracefully. You possess strong leadership skills and have a consistent track record of developing and implementing strategies to achieve organizational objectives. Requirements for the Role: - Bachelor's or master's degree in Computer Science, Engineering, or a related field. - 5-7 years of experience in data engineering, analytics, or a related role. - Good experience with analytical SQL techniques. - Experience building data products following data modeling concepts such as Kimball, Data Mesh, or Data Vault. - Strong problem-solving skills and attention to detail. - Good communication and collaboration skills. - Proficiency in programming languages like Python and R. - Familiarity with traditional database platforms (MySQL, SQL Server, Oracle), cloud platforms (AWS, Azure, Google Cloud), data warehousing and pipelines (Hive, Spark, Snowflake, DBT, Airflow), data ingestion and integration (Informatica, Fivetran, Mulesoft), data visualization and dashboarding (Power BI, Tableau), and CRM and ERP data (Salesforce, Oracle E-Business Suite).
Our Commitment to You: At Emerson, we prioritize a workplace where every employee is valued, respected, and empowered to grow. We foster an environment that encourages innovation, collaboration, and diverse perspectives because we believe that great ideas come from great teams. Our commitment to ongoing career development and growing an inclusive culture ensures you have the support to thrive. We invest in your success through mentorship, training, and leadership opportunities to make a lasting impact. We believe that diverse teams working together are key to driving growth and delivering business results. Employee wellbeing is important to us, and we provide competitive benefits plans, various medical insurance options, an Employee Assistance Program, employee resource groups, recognition, and more. Our culture offers flexible time off plans, including paid parental leave (maternal and paternal), vacation, and holiday leave.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

We believe that great work takes time, and so does a great job application. The deadline to apply for this role is Friday, 22nd August. We will begin reviewing applications after this deadline, so there's no rush to be the first to apply. In order to help you put your best foot forward, we look for the following in your application: - Clear, specific examples of your work and its impact - Relevant experience aligned with the focus of this role - A thoughtful response to any application questions - A resume that highlights both what you did and why it mattered If you are unsure whether your background is a perfect fit, we still encourage you to apply, as we value potential and growth just as much as experience. This role exists because our Service Excellence team is expanding to meet growing client demand. We are seeking talented Data Platform Consultants who can ensure smooth operations, resolve issues, and proactively optimize performance across modern data platforms. This role is part of our service delivery function, which goes beyond traditional support. You will be assisting clients in maintaining and enhancing their data platforms post go-live, ensuring they run smoothly. Your responsibilities in this role will include: - Monitoring applications and resolving issues through debugging and change requests - Designing small-scale systems and optimizing performance - Using tools like SQL, Matillion, Snowflake, and ServiceNow to manage tasks and SLAs - Collaborating with delivery teams during transition phases and knowledge transfer - Contributing to documentation and knowledge sharing to ensure continuity and team learning In the first 3-6 months, success in this role might look like: - Mastering our technology stack: Snowflake and Matillion - Responding to customer issues proactively and meeting SLAs with minimal supervision - Identifying and resolving recurring issues and contributing to documentation and knowledge sharing To succeed in this role, we are looking for someone who brings technical curiosity, reliability, and a proactive mindset. You would be a great fit if you have experience with ETL tools (Matillion preferred), a strong understanding of data modeling (Kimball methodology) and data warehousing concepts, familiarity with ticketing tools like ServiceNow, Jira, or similar, analytical thinking, dedication, and a willingness to learn and ask questions. This role might not be the right fit if you are looking for a purely development-focused role without operational responsibilities, or if you need a rigid working schedule; adaptability is key (CET hours with flexibility for early monitoring). Snap Analytics is a high-growth data analytics consultancy specializing in cutting-edge cloud analytics solutions. We partner with enterprise businesses worldwide to modernize their data platforms, enabling smarter decision-making through technologies like Snowflake, Databricks, Matillion, and others. Our culture is built on collaboration, continuous learning, and pushing boundaries. Join us and be part of a team that's shaping the future of data analytics!

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

As a Data Engineer at our esteemed organization, you will be responsible for designing, building, and maintaining scalable data pipelines on Snowflake. Your expertise should include experience or knowledge in Snowpipe, Time Travel, and Fail-safe functionalities. Writing and optimizing SQL queries for efficient data extraction and transformation will be a key aspect of your role. You will also be expected to develop ETL processes to seamlessly integrate various data sources into Snowflake and monitor and troubleshoot data warehouse performance issues. Implementing security measures and data governance practices will be crucial, along with possessing a deep understanding of Snowflake architecture. Collaboration with cross-functional teams to support analytical and reporting needs will be an integral part of your responsibilities. It would be advantageous to have knowledge of Fivetran. With 5 to 8 years of experience, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Demonstrated proficiency in Snowflake and data warehousing concepts, as well as expertise in SQL and ETL tools such as Talend or Informatica, is essential. Joining our team means becoming a part of one of the top-ranked IT companies in Ahmedabad, Gujarat. We are proud to be ISO 9001:2015 & ISO 27001:2013 certified, leading global technology solution providers with a strong focus on the USA, Middle East, and Canada. Our services encompass custom software development, Enterprise Mobility Solutions, and the Internet of Things. Our diverse team of passionate and experienced professionals consistently strives to exceed customer expectations by implementing industry best practices. At Stridely Solutions, you will have the opportunity to work on international enterprise-level projects of significant size, interact with US customers, and benefit from an Employee-First approach. We prioritize continuous learning, training, and knowledge enhancement for all our employees. Our organizational culture is characterized by self-development, career growth opportunities, democratic values, and strong leadership. Recognizing your potential, we offer overseas visits, transfers, and exposure to broaden your horizons. If you are looking for a dynamic work environment that encourages innovation and growth, Stridely Solutions is the place for you. Visit our website at www.stridelysolutions.com to learn more about us. Join our team of 500+ employees and enjoy a 5-day workweek in locations such as Ahmedabad, Pune, or Vadodara.
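
For context on the Snowpipe and Time Travel functionality the role asks about, a minimal sketch with hypothetical stage, pipe, and table names:

```sql
-- Snowpipe: continuously load files as they land in an external stage.
CREATE OR REPLACE PIPE raw.sales_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.sales_events
  FROM @raw.s3_sales_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Time Travel: query the table as it looked 30 minutes ago ...
SELECT COUNT(*) FROM raw.sales_events AT (OFFSET => -1800);

-- ... or recover a table dropped by mistake, within the retention window
-- (Fail-safe then covers a further, non-queryable recovery period).
UNDROP TABLE raw.sales_events;
```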

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science, empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market. Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious in business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams, and are passionate about pulling your weight to make sure the team succeeds. What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Senior Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. The Senior Data Engineering Consultant will: Create and maintain optimal data pipeline architecture Assemble large, complex data sets that meet functional/non-functional business requirements Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Big Data technologies Develop ETL processes to ensure timely delivery of required data for customers Implement data quality measures to ensure accuracy, consistency, and integrity of data Design, implement, and maintain data models that can support the organization's data storage and analysis needs Deliver technical and functional specifications to support data governance and knowledge sharing In This Role, You Will Have: B.Tech degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education 3-6 years of experience delivering consulting services to medium and large enterprises. Implementations must have included a combination of the following experiences: Data Warehousing or Big Data consulting for mid-to-large-sized organizations.
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture Strong experience with Snowflake and Data Warehouse architecture SnowPro Core certification is highly desired Hands-on experience with Python (Pandas, DataFrames, functions) Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design Strong experience with Apache Airflow and API integrations Solid experience in any one of the ETL tools (Informatica, Talend, SAP BODS, DataStage, Dell Boomi, Mulesoft, Fivetran, Matillion, etc.) Nice to have: Experience in Docker, DBT, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions Strong presentation and communication skills Next Steps: Our recruitment process is highly personalized. Some candidates complete the hiring process in one week; others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision! At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer and all qualified applicants will receive consideration for employment.
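
Since the role calls out SQL stored procedures and debugging, here is a minimal Snowflake SQL-scripting procedure as a flavor of that work; the tables, columns, and procedure name are hypothetical.

```sql
-- Archive rows older than a cutoff and report how many moved.
CREATE OR REPLACE PROCEDURE archive_old_orders(days_to_keep INTEGER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
  moved INTEGER;
BEGIN
  INSERT INTO orders_archive
    SELECT * FROM orders
    WHERE order_date < DATEADD(day, -:days_to_keep, CURRENT_DATE());
  moved := SQLROWCOUNT;  -- rows affected by the last DML statement
  DELETE FROM orders
    WHERE order_date < DATEADD(day, -:days_to_keep, CURRENT_DATE());
  RETURN 'Archived ' || moved || ' rows';
END;
$$;

CALL archive_old_orders(365);
```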

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Dataction is a forward-thinking technology services firm that delivers top-notch information technology, analytics, and consulting services to esteemed international organizations. Since its establishment in 2010, Dataction has experienced significant growth and has earned a reputation for offering innovative and dependable services to a diverse clientele across various industries. At Dataction, we have a unique approach of connecting all the dots and reimagining every business process. Our team adopts a lean, agile, and consultative methodology to tackle challenges and drive execution, enabling our clients to achieve sustainable growth, secure profitability, and ensure a promising future. Our team members are known for their dedication, courage, and willingness to push boundaries, making Dataction an inspiring and dynamic workplace. As a Sr. Engineer Data Support at Dataction, you will play a critical role within our Data Engineering support team. Your primary responsibility will be to ensure the smooth operation, stability, and performance of our data platforms and pipelines by implementing enhancements, addressing issues, and optimizing data operations. The ideal candidate should possess a robust technical background in Snowflake, DBT, SQL, AWS, and various data integration tools, along with exceptional problem-solving skills and effective communication abilities. **Responsibilities:** - Collaborate with Engineering and Business teams to resolve live issues and implement improvements. - Investigate data-related problems, troubleshoot discrepancies, and deliver timely solutions. - Enhance operational efficiency by automating manual processes through scripting and tool integrations. - Conduct regular health checks and proactive monitoring to maintain system stability and performance. - Manage incident response activities, including issue triage, root cause analysis, and resolution coordination. - Keep stakeholders informed about platform status, incidents, and resolutions. - Administer and maintain platforms across AWS, Snowflake, DBT, and related technologies. - Ensure smooth data operations by working with data pipelines and tools like HVR, Stitch, Fivetran, and Terraform. - Continuously enhance monitoring, alerting, and operational workflows to minimize downtime and boost performance. **Qualifications, Skills, And Experience:** - 5+ years of relevant experience in data operations. - Bachelor's degree in Computer Science, Information Systems, or a related field. - Proficiency in AWS, Snowflake, DBT, HVR, Stitch, Fivetran, and Terraform for managing data platforms. - Strong analytical and problem-solving skills for issue diagnosis and resolution. - Experience in incident management, system monitoring, and automation. - Ability to script in SQL and Python for automation and data analysis. - Effective communication with both technical and non-technical stakeholders. If you are looking for a workplace that values fairness, meritocracy, empowerment, and opportunities, Dataction is the perfect fit for you. In addition to a competitive salary, joining Dataction offers: - Excellent work-life balance with a hybrid work arrangement. - Company-funded skill enhancement and training opportunities. - Exciting reward and recognition programs. - Engaging employee engagement initiatives to bond with colleagues. - On-the-job learning exposure through involvement in new product/ideation teams. - Quarterly one-on-one sessions with the CEO for insights on any topic of your choice.

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a technically proficient Product Owner with expertise in data and analytics platforms, you will play a crucial role in leading the development of scalable and insight-driven data products. Your responsibilities will involve collaborating closely with data engineers, architects, analysts, and business stakeholders to convert raw data into impactful tools and solutions that drive business intelligence, advanced analytics, and operational efficiency. You will be responsible for defining and managing the product roadmap for data-centric platforms, services, and analytics tools. This will include translating business and analytical goals into detailed technical product requirements, user stories, and epics. By owning and prioritizing the product backlog, you will maximize business value and technical scalability while ensuring seamless delivery of high-performance features through collaboration with engineering, analytics, and design teams. In the realm of analytics and data product development, you will lead the creation of dashboards, reporting tools, self-service analytics, and predictive models. Additionally, you will guide the design and implementation of scalable data pipelines, data lakes, and warehouse architectures using tools such as Snowflake, Redshift, Power BI, Tableau, and Looker. Defining key performance indicators (KPIs) and grounding all features in measurable outcomes will also be key aspects of your role. Acting as a liaison between product, engineering, data science, and business teams, you will partner with engineering and data teams on ETL workflows, data modeling, APIs, and system integration. You will drive delivery using Agile methodologies, ensuring feature launches are supported with documentation, training resources, and adoption strategies. Governance, compliance, and scalability will be crucial areas where you will focus, ensuring product compliance with data governance, GDPR, and security best practices. You will promote scalable architecture and engineering best practices through reusable data models and pipelines, advocating for observability, monitoring, and data quality practices. Preferred Technical Environment: - Languages & Tools: SQL (mandatory), Python or R (preferred), Git, JIRA - BI & Analytics Tools: Power BI, Tableau, Looker - Data Infrastructure: Snowflake, Redshift, BigQuery, dbt, Fivetran, Airbyte - Cloud Platforms: AWS, Azure, or GCP - Agile Tooling: JIRA, Confluence, Miro - Version Control & CI/CD: GitHub, GitLab, Jenkins Qualifications: - Minimum 7 years of experience as a Product Owner or Technical Product Manager for analytics or data products - Proven ability to work with cloud-native data platforms and modern data engineering stacks - Strong understanding of data pipelines, data modeling, ETL orchestration, and warehouse design - Hands-on experience with SQL and at least one modern BI platform - Experience in driving measurable business outcomes through data product initiatives This role offers you the opportunity to make a significant impact by leveraging your technical expertise to drive the development of impactful data products that empower businesses to make informed decisions based on data-driven insights.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

Hyderabad, Telangana

On-site

You will be joining QTek Digital, a leading data solutions provider known for its expertise in custom data management, data warehouse, and data science solutions. Our team of dedicated data professionals, including data scientists, data analysts, and data engineers, collaborates to address present-day challenges and pave the way for future innovations. At QTek Digital, we value our employees and focus on fostering engagement, empowerment, and continuous growth opportunities. As a BI ETL Engineer at QTek Digital, you will be taking on a full-time remote position. Your primary responsibilities will revolve around tasks such as data modeling, applying analytical skills, implementing data warehouse solutions, and managing Extract, Transform, Load (ETL) processes. This role demands strong problem-solving capabilities and the capacity to work autonomously. To excel in this role, you should ideally possess: - 6-9 years of hands-on experience in ETL and ELT pipeline development using tools like Pentaho, SSIS, Fivetran, Airbyte, or similar platforms. - 6-8 years of practical experience in SQL and other data manipulation languages. - Proficiency in Data Modeling, Dashboard creation, and Analytics. - Sound knowledge of data warehousing principles, particularly Kimball design. - Bonus points for familiarity with Pentaho and Airbyte administration. - Demonstrated expertise in Data Modeling, Dashboard design, Analytics, Data Warehousing, and ETL procedures. - Strong troubleshooting and problem-solving skills. - Effective communication and collaboration abilities. - Capability to operate both independently and as part of a team. - A Bachelor's degree in Computer Science, Information Systems, or a related field. This position is based in our Hyderabad office, offering an attractive compensation package ranging from INR 5-19 Lakhs, depending on various factors such as your skills and prior experience. Join us at QTek Digital and be part of a dynamic team dedicated to shaping the future of data solutions.
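
The Kimball design this posting emphasizes boils down to fact tables keyed to conformed dimensions via surrogate keys. A generic warehouse-SQL sketch follows (identity and constraint syntax varies by platform, and all names are hypothetical):

```sql
CREATE TABLE dim_customer (
    customer_key  INTEGER IDENTITY PRIMARY KEY,  -- surrogate key
    customer_id   VARCHAR NOT NULL,              -- natural/business key
    customer_name VARCHAR,
    valid_from    DATE,                          -- SCD Type 2 effective dating
    valid_to      DATE,
    is_current    BOOLEAN
);

CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,              -- e.g., 20250821
    full_date  DATE,
    month_name VARCHAR,
    year_num   INTEGER
);

CREATE TABLE fct_sales (
    date_key     INTEGER REFERENCES dim_date (date_key),
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    quantity     INTEGER,
    net_amount   NUMERIC(12,2)                   -- additive measure
);
```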

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

Karnataka

On-site

Job Description: As a Snowflake Admin with 6+ years of experience and the ability to join immediately, you will be responsible for administering and managing Snowflake environments. This includes configuring, ensuring security, and conducting maintenance tasks. Your role will involve monitoring and optimizing Snowflake performance, storage usage, and query efficiency to enhance overall system functionality. In this position, you will be required to implement and manage role-based access control (RBAC) and data security policies to safeguard sensitive information. Additionally, you will set up and oversee data sharing, data replication, and virtual warehouses to support various data operations effectively. You will be expected to automate administrative tasks using SQL, Snowflake CLI, or scripting languages such as Python and Bash. Your proficiency in these tools will be essential in streamlining processes and improving efficiency within the Snowflake environment. Furthermore, providing support for data integration tools and pipelines like Fivetran, dbt, Informatica, and Airflow will be part of your responsibilities. Key Skills: - Snowflake Admin Industry Type: IT/ Computers - Software Functional Area: Not specified Required Education: Bachelor Employment Type: Full Time, Permanent If you are looking for a dynamic opportunity to utilize your expertise in Snowflake administration, apply now with Job Code: GO/JC/668/2025. Join our team and work alongside our Recruiter, Christopher, in a contract hiring role.
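
As a rough illustration of the RBAC and cost-control duties listed above, a SQL sketch with hypothetical role, user, and warehouse names:

```sql
-- Role-based access control: grant read access at the schema level.
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jdoe;

-- Resource monitor: cap monthly credit spend and suspend at the limit.
CREATE OR REPLACE RESOURCE MONITOR monthly_cap
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_cap;
```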

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

Delhi

On-site

As a Partner Solution Engineer at Snowflake, you will play a crucial role in technically onboarding and enabling partners to re-platform their Data and AI applications onto the Snowflake AI Data Cloud. Collaborating with partners to develop Snowflake solutions in customer engagements, you will work with them to create assets and demos, build hands-on POCs, and pitch Snowflake solutions. Additionally, you will assist Solution Providers/Practice Leads with the technical strategies that enable them to sell their offerings on Snowflake. Your responsibilities will include keeping partners up to date on key Snowflake product updates and future roadmaps to help them represent Snowflake to their clients about the latest technology solutions and benefits. Running technical enablement programs to provide best practices and solution design workshops to help partners create effective solutions will also be part of your role. Success in this position will require you to drive strategic engagements by quickly grasping new concepts and articulating their business value. You will showcase the impact of Snowflake through compelling customer success stories and case studies, demonstrating a strong understanding of how partners make revenue through the industry priorities and complexities they face. Preferred skill sets and experiences for this role include having a total of 10+ years of relevant experience, experience working with Tech Partners, ISVs, and System Integrators (SIs) in India, and developing data domain thought leadership within the partner community. You should also have presales or hands-on experience with Data Warehouse, Data Lake, or Lakehouse platforms, as well as experience with partner integration ecosystems like Alation, Fivetran, Informatica, dbt Cloud, etc. Having hands-on experience and strong knowledge of Docker and how to containerize Python-based applications, knowledge of container networking and Kubernetes, and proficiency in Agile development practices and Continuous Integration/Continuous Deployment (CI/CD), including DataOps and MLOps, are desirable skills. Experience in the AI/ML domain is a plus. Snowflake is rapidly expanding, and as part of the team, you will help enable and accelerate the company's growth. If you share Snowflake's values, challenge ordinary thinking, and push the pace of innovation while building a future for yourself and Snowflake, this role could be the perfect fit for you. Please visit the Snowflake Careers Site for salary and benefits information if the job is located in the United States.

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

You will be part of a data analytics services company that specializes in creating and managing scalable data platforms for a diverse client base. Leveraging cutting-edge technologies, you will provide actionable insights and value through modern data stack solutions. Your responsibilities will include designing, building, and managing customer data platforms independently using Snowflake, dbt, Fivetran, and SQL. Collaborating with clients and internal teams to gather business requirements and translating them into reliable data solutions will be a key aspect of your role. You will also develop and maintain ELT pipelines with Fivetran and dbt for automating data ingestion, transformation, and delivery. Optimizing SQL code and data models for scalability, performance, and cost efficiency in Snowflake will be crucial. Additionally, ensuring data platform reliability, monitoring, and data quality maintenance will be part of your responsibilities. You will also provide technical mentorship and guidance to junior engineers and maintain comprehensive documentation of engineering processes and architecture. The required skills and qualifications for this role include proven hands-on experience with Snowflake, dbt, Fivetran, and SQL. You should have a strong understanding of data warehousing concepts, ETL/ELT best practices, and modern data stack architectures. Experience in working independently and owning project deliverables end-to-end is essential. Familiarity with version control systems like Git and workflow automation tools, along with solid communication and documentation skills, is necessary. You should also be able to interact directly with clients and understand their business requirements. Preferred skills that would be beneficial for this role include exposure to cloud platforms like AWS, GCP, and Azure, knowledge of Python or other scripting languages for data pipelines, and experience with BI/analytics tools such as Tableau, Power BI, and Looker. In return, you will have the opportunity to lead the implementation of state-of-the-art data platforms for global clients in a dynamic, growth-oriented work environment with flexible working arrangements and a competitive compensation package. If you are interested in this opportunity, please submit your resume and a short cover letter detailing your experience with Snowflake, dbt, Fivetran, and SQL.
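
"Optimizing SQL and Snowflake cost", as this role asks, often starts with the account usage views. A sketch of that kind of check, using columns from Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view (which lags roughly 45 minutes); the thresholds and window are arbitrary choices for illustration:

```sql
-- Find last week's slowest queries, with hints at why they were slow.
SELECT
    query_text,
    warehouse_name,
    total_elapsed_time / 1000 AS elapsed_s,  -- view reports milliseconds
    bytes_spilled_to_local_storage,          -- spilling suggests an undersized warehouse
    partitions_scanned,
    partitions_total                         -- scanned vs. total ~ pruning effectiveness
FROM snowflake.account_usage.query_history
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;
```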

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

We are seeking a highly skilled and experienced ETL Developer with expertise in data ingestion and extraction to join our team. With 8-12 years of experience, you specialize in building and managing scalable ETL pipelines, integrating diverse data sources, and optimizing data workflows specifically for Snowflake. Your role will involve collaborating with cross-functional teams to extract, transform, and load large-scale datasets in a cloud-based data ecosystem, ensuring data quality, consistency, and performance. Your responsibilities will include designing and implementing processes to extract data from various sources such as on-premise databases, cloud storage (S3, GCS), APIs, and third-party applications. You will ensure seamless data ingestion into Snowflake, utilizing tools like SnowSQL, COPY INTO commands, Snowpipe, and third-party ETL tools (Matillion, Talend, Fivetran). Developing robust solutions for handling data ingestion challenges such as connectivity issues, schema mismatches, and data format inconsistencies will be a key aspect of your role. Within Snowflake, you will perform complex data transformations using SQL-based ELT methodologies, implement incremental loading strategies, and track data changes using Change Data Capture (CDC) techniques. You will optimize transformation processes for performance and scalability, leveraging Snowflake's native capabilities such as clustering, materialized views, and UDFs. Designing and maintaining ETL pipelines capable of efficiently processing terabytes of data will be part of your responsibilities. You will optimize ETL jobs for performance, parallelism, and data compression, ensuring error logging, retry mechanisms, and real-time monitoring for robust pipeline operation. Your role will also involve implementing mechanisms for data validation, integrity checks, duplicate handling, and consistency verification. Collaborating with stakeholders to ensure adherence to data governance standards and compliance requirements will be essential. You will work closely with data engineers, analysts, and business stakeholders to define requirements and deliver high-quality solutions. Documenting data workflows, technical designs, and operational procedures will also be part of your responsibilities. Your expertise should include 8-12 years of experience in ETL development and data engineering, with significant experience in Snowflake. You should be proficient in tools and technologies such as Snowflake (SnowSQL, COPY INTO, Snowpipe, external tables), ETL tools (Matillion, Talend, Fivetran), cloud storage (S3, GCS, Azure Blob Storage), databases (Oracle, SQL Server, PostgreSQL, MySQL), and APIs (REST, SOAP for data extraction). Strong SQL skills, performance optimization techniques, data transformation expertise, and soft skills like strong analytical thinking, problem-solving abilities, and excellent communication skills are essential for this role. Location: Bhilai, Indore
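
The COPY INTO and CDC duties described above often combine in Snowflake as a bulk load plus a stream-driven MERGE. A hedged sketch with hypothetical stage, table, and column names:

```sql
-- Bulk load: copy staged CSV files into a raw table.
COPY INTO raw.orders
FROM @raw.s3_stage/orders/
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'CONTINUE';   -- log bad rows instead of failing the whole load

-- CDC: a stream records changes on the raw table since it was last consumed.
CREATE OR REPLACE STREAM raw.orders_stream ON TABLE raw.orders;

-- Incremental upsert into the curated layer, consuming the stream.
MERGE INTO curated.orders AS t
USING raw.orders_stream AS s
  ON t.order_id = s.order_id
WHEN MATCHED AND s.metadata$action = 'INSERT' THEN
  UPDATE SET t.status = s.status, t.updated_at = s.updated_at
WHEN NOT MATCHED AND s.metadata$action = 'INSERT' THEN
  INSERT (order_id, status, updated_at) VALUES (s.order_id, s.status, s.updated_at);
```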

Posted 1 month ago

Apply

0.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Work Location: Hyderabad What Gramener offers you Gramener will offer you an inviting workplace, talented colleagues from diverse backgrounds, career paths, and steady growth prospects with great scope to innovate. We aim to create an ecosystem of easily configurable data applications focused on storytelling for public and private use. Data Architect We are seeking an experienced Data Architect to design and govern scalable, secure, and efficient data platforms in a data mesh environment. You will lead data architecture initiatives across multiple domains, enabling self-serve data products built on Databricks and AWS, and support both operational and analytical use cases. Key Responsibilities Design and implement enterprise-grade data architectures leveraging the medallion architecture (Bronze, Silver, Gold). Develop and enforce data modelling standards, including flattened data models optimized for analytics. Define and implement MDM strategies (Reltio), data governance frameworks (Collibra), and data classification policies. Lead the development of data landscapes, capturing sources, flows, transformations, and consumption layers. Collaborate with domain teams to ensure consistency across decentralized data products in a data mesh architecture. Guide best practices for ingesting and transforming data using Fivetran, PySpark, SQL, and Delta Live Tables (DLT). Define metadata and data quality standards across domains. Provide architectural oversight for data platform development on Databricks (Lakehouse) and the AWS ecosystem. Key Skills & Qualifications Must-Have Technical Skills (Reltio, Collibra, Ataccama, Immuta): Experience in the Pharma domain. Data modeling (dimensional, flattened, common data model, canonical, and domain-specific; entity-level data understanding from a business process point of view). Master Data Management (MDM) principles and tools (Reltio). Data governance and data classification frameworks. Strong experience with Fivetran, PySpark, SQL, Python. Deep understanding of Databricks (Delta Lake, Unity Catalog, Workflows, DLT). Experience with AWS services related to data (e.g., S3, Glue, Redshift, IAM). Experience with Snowflake. Architecture & Design Proven expertise in Data Mesh or Domain-Oriented Data Architecture. Experience with medallion/lakehouse architecture. Ability to create data blueprints and landscape maps across complex enterprise systems. Soft Skills Strong stakeholder management across business and technology teams. Ability to translate business requirements into scalable data designs. Excellent communication and documentation skills. Preferred Qualifications Familiarity with regulatory and compliance frameworks (e.g., GxP, HIPAA, GDPR). Background in data product building. About Us We consult and deliver solutions to organizations where data is the core of decision-making. We undertake strategic data consulting for organizations, laying out the roadmap for data-driven decision-making. This helps organizations convert data into a strategic differentiator. Through a host of our products, solutions, and service offerings, we analyze and visualize large amounts of data. To know more about us, visit the Gramener Website and Gramener Blog.
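
The medallion (Bronze/Silver/Gold) flow with Delta Live Tables that this role governs looks roughly like the following in DLT's SQL dialect (classic LIVE TABLE syntax; newer runtimes also accept STREAMING TABLE / MATERIALIZED VIEW). Paths and columns are hypothetical.

```sql
-- Bronze: land raw JSON files incrementally via Auto Loader.
CREATE OR REFRESH STREAMING LIVE TABLE bronze_orders
AS SELECT * FROM cloud_files('s3://landing/orders/', 'json');

-- Silver: clean and type the data, dropping rows that fail expectations.
CREATE OR REFRESH STREAMING LIVE TABLE silver_orders (
  CONSTRAINT valid_id EXPECT (order_id IS NOT NULL) ON VIOLATION DROP ROW
)
AS SELECT
     CAST(order_id AS BIGINT) AS order_id,
     CAST(amount   AS DOUBLE) AS amount,
     to_date(order_ts)        AS order_date
   FROM STREAM(LIVE.bronze_orders);

-- Gold: business-level aggregate for analytics consumers.
CREATE OR REFRESH LIVE TABLE gold_daily_revenue
AS SELECT order_date, SUM(amount) AS revenue
   FROM LIVE.silver_orders
   GROUP BY order_date;
```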

Posted 1 month ago

Apply

5.0 - 7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

curatAId is seeking a Senior Snowflake Consultant on behalf of our client, a fast-growing organization focused on data-driven innovation. This role combines Snowflake expertise with DevOps, DBT, and Airflow to support the development and operation of a modern, cloud-based enterprise data platform. The ideal candidate will be responsible for building and managing data infrastructure, developing scalable data pipelines, implementing data quality and governance frameworks, and automating workflows for operational efficiency. To apply for this position, it is mandatory to register on our platform at www.curataid.com and take a 10-minute technical quiz on the Snowflake skill. Title: Senior Data Engineer Level: Consultant/Deputy Manager/Manager/Senior Manager Relevant Experience: Minimum of 5+ years of hands-on experience on Snowflake with DevOps, DBT, Airflow Must-Have Skills: Data Engineering, Snowflake, DBT, Airflow & DevOps Location: Mumbai, Gurgaon, Bengaluru, Chennai, Kolkata, Bhubaneshwar, Coimbatore, Ahmedabad Qualifications 5+ years of relevant Snowflake experience in a data engineering context. (Must have) 4+ years of relevant experience in DBT, Airflow & DevOps. (Must have) Strong hands-on experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines. Must have experience with cloud data warehouses like Snowflake, Amazon Redshift, Google BigQuery, or Azure Synapse. Experience with version control systems (GitHub, BitBucket, GitLab). Strong SQL expertise. Implement best practices for data storage management, security, and retrieval efficiency. Experience with pipeline orchestration tools (Fivetran, Stitch, Airflow, etc.). Coding proficiency in at least one modern programming language (Python, Java, Scala, etc.).
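
The stack above is typically orchestrated from Airflow running dbt; to stay in SQL, here is Snowflake's native equivalent of a scheduled, data-aware job, with hypothetical names, purely as an illustration of the workflow automation the role covers.

```sql
-- A stream marks unprocessed changes; the task runs only when data arrived.
CREATE OR REPLACE STREAM staging.events_stream ON TABLE staging.events;

CREATE OR REPLACE TASK transform_events
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('staging.events_stream')
AS
  INSERT INTO curated.events
  SELECT event_id, user_id, event_ts
  FROM staging.events_stream;   -- consuming the stream advances its offset

ALTER TASK transform_events RESUME;  -- tasks are created suspended
```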

Posted 1 month ago

Apply

6.0 - 10.0 years

7 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Data Engineering, Airflow, Fivetran, CI/CD using GitHub. We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong Data Engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM along with Business and Enterprise Technology teams. As a Senior Data Engineer, you will: Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations. Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes. Cultivate collaboration with corporate engineering, product teams, and other engineering groups. Lead and mentor engineering discussions, advocating for best practices. Actively participate in design and code reviews. Access and explore third-party data APIs to determine the data required to meet business needs. Ensure data quality and integrity across different sources and systems. Manage data pipelines for both analytics and operational purposes. Continuously enhance processes and policies to improve SLA and SOX compliance. You'll be a great addition to the team if you: Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field. Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments. Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment. Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes. Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB). Are proficient in programming with Python or other scripting languages. Have familiarity with columnar OLAP databases and data modeling. Have experience in building ELT/ETL processes using tools like dbt, Airflow, Fivetran, CI/CD using GitHub, and reporting in Tableau. Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements. Added bonus if you also have: A good understanding of Salesforce & Netsuite systems Experience in SaaS environments Designed and deployed ML models Experience with events and streaming data Location: Remote - Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 1 month ago

Apply


15.0 - 19.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Senior AI Architect at Dailoqa, you will play a pivotal role in shaping, designing, and delivering agentic AI solutions that drive real-world business value. You will collaborate with business and technology stakeholders, lead cross-functional teams, and ensure our AI architectures are robust, scalable, and aligned with our vision of combined intelligence and financial inclusion.

Agentic AI Solution Design:
Collaborate with stakeholders to identify high-impact agentic AI use cases, define success metrics, and determine data requirements tailored to Financial Services clients.
Architect and oversee the implementation of end-to-end agentic AI solutions aligned with Dailoqa's strategic objectives and client needs.

Leadership & Cross-Functional Collaboration:
Lead and mentor cross-functional teams in the development and deployment of scalable agentic AI applications and infrastructures.
Work closely with business stakeholders to translate complex requirements into actionable AI architecture and technical roadmaps.

Technology Evaluation & Governance:
Evaluate, recommend, and integrate advanced AI/ML platforms, frameworks, and technologies that enable agentic AI capabilities.
Develop and enforce AI governance frameworks, best practices, and ethical standards, ensuring compliance with industry regulations and responsible AI principles.

Performance Optimization & Continuous Improvement:
Optimize AI models for performance, scalability, and efficiency, leveraging cloud-native and distributed computing resources.
Stay ahead of emerging trends in agentic AI, machine learning, and data science, applying new insights to enhance solution quality and business impact.

Technical Leadership & Talent Development:
Provide technical leadership, mentorship, and code review for junior and peer team members.
Participate in the hiring, onboarding, and development of AI talent, fostering a culture of innovation and excellence.
Lead sprint planning and technical assessments, and ensure high standards in code quality and solution delivery.

Required Qualifications:
- 15+ years of total experience, including 8+ years in machine learning and data science, with more recent experience (4-5 years) applying generative AI models to practical, end-to-end technology solutions and AI consultancy.
- Knowledge of basic algorithms, object-oriented and functional design principles, and best-practice patterns.
- Experience implementing GenAI, NLP, computer vision, or other AI frameworks/technologies.

Tools & Technology:
- LLMs and implementing RAG or different prompt strategies.
- Azure OpenAI and off-the-shelf, platform-native AI tools and models.
- Knowledge of ML pipeline orchestration tools.
- Experienced in Python, ideally with working knowledge of its common supporting packages.
- Experience in REST API development, NoSQL database design, and RDBMS design and optimization.
- Strong experience in data engineering and aligned hyperscale platforms (e.g., Databricks, Synapse, Fivetran).

Education and Other Skills:
- Master's or Ph.D. in Computer Science, Data Science, or a related field.
- Extensive experience with modern AI frameworks, cloud platforms, and big data technologies.
- Strong background in designing and implementing AI solutions for enterprise-level applications.
- Proven ability to lead and mentor technical teams.
- Excellent communication skills, with the ability to explain complex AI concepts to both technical and non-technical audiences.
- Deep understanding of AI ethics and responsible AI practices.
Working at Dailoqa will provide you with an opportunity to be part of a dynamic and innovative team that values collaboration, innovation, and continuous learning. If you are proactive, adaptable, and passionate about leveraging AI to solve real-world challenges in the financial services industry, this role might be the perfect fit for you.
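The posting asks for hands-on experience "implementing RAG or different prompt strategies". The retrieval step of RAG reduces to a small, model-agnostic routine; the sketch below is a toy illustration only, with a character-frequency stand-in for a real embedding model (e.g., an Azure OpenAI embeddings deployment) so that it runs without external services.

```python
# Toy sketch of RAG retrieval: rank document chunks by cosine similarity
# to the query embedding, then feed the top chunks to the LLM as context.
import numpy as np


def embed(text: str) -> np.ndarray:
    # Placeholder embedding: a normalized character-frequency vector.
    # Replace with a real embedding model in any actual system.
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


def top_k_chunks(query: str, chunks: list[str], k: int = 3) -> list[str]:
    """Return the k chunks most similar to the query (unit vectors, so
    the dot product is the cosine similarity)."""
    q = embed(query)
    scored = [(float(np.dot(q, embed(chunk))), chunk) for chunk in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [chunk for _, chunk in scored[:k]]


docs = ["KYC rules for retail accounts", "Loan pricing model notes", "Fraud alert playbook"]
print(top_k_chunks("How do we price loans?", docs, k=1))
```

The retrieved chunks are then inserted into the prompt as grounding context; that step is the same regardless of which vector store, embedding model, or LLM the architecture standardizes on.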

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

You are a Sr. Data Engineer with over 7 years of experience, specializing in data engineering, Python, and SQL. You will be part of the Data Engineering team in the Enterprise Data Insights organization, responsible for building data solutions, designing ETL/ELT processes, and managing the data platform to support various stakeholders across the organization. Your role is crucial in driving technology- and data-led solutions to foster growth and innovation at scale.

Your responsibilities as a Senior Data Engineer include collaborating with cross-functional stakeholders to prioritize requests, identify areas for improvement, and provide recommendations. You will lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes. Furthermore, you will foster collaboration with corporate engineering, product teams, and other engineering groups, while leading and mentoring engineering discussions and advocating for best practices.

To excel in this role, you should hold a degree in Computer Science or a related technical field and have a proven track record of over 5 years in data engineering. Your expertise should include designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment, and developing data products and APIs. Proficiency in SQL/NoSQL databases, particularly Snowflake, Redshift, or MongoDB, along with strong programming skills in Python, is essential. Additionally, experience with columnar OLAP databases, data modeling, and tools like dbt, Airflow, Fivetran, GitHub, and Tableau reporting will be beneficial. Good communication and interpersonal skills are crucial for effectively collaborating with business stakeholders and translating requirements into actionable insights.

An added advantage would be a good understanding of Salesforce and NetSuite systems, experience in SaaS environments, experience designing and deploying ML models, and familiarity with events and streaming data. Join us in driving data-driven solutions and experiences to shape the future of technology and innovation.
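A recurring duty in these roles is enforcing data quality after each load. A minimal sketch of such a gate, using the Snowflake Python connector, is below; the account, credentials, and table names are hypothetical placeholders, and real pipelines would usually express these checks as dbt tests or an observability tool instead.

```python
# Minimal post-load data-quality gate against Snowflake.
# Fails loudly so the orchestrator (e.g., Airflow) marks the task failed.
import snowflake.connector

CHECKS = {
    "row_count": "SELECT COUNT(*) FROM analytics.orders",                       # must be > 0
    "null_order_ids": "SELECT COUNT(*) FROM analytics.orders WHERE order_id IS NULL",  # must be 0
    "future_dates": "SELECT COUNT(*) FROM analytics.orders WHERE order_date > CURRENT_DATE",  # must be 0
}

conn = snowflake.connector.connect(
    account="my_account",        # placeholder
    user="etl_service_user",     # placeholder; prefer key-pair auth in practice
    password="***",              # placeholder; load from a secrets manager
    warehouse="TRANSFORM_WH",    # placeholder
)
try:
    cur = conn.cursor()
    failures = []
    for name, sql in CHECKS.items():
        count = cur.execute(sql).fetchone()[0]
        ok = count > 0 if name == "row_count" else count == 0
        if not ok:
            failures.append(f"{name}={count}")
    if failures:
        raise RuntimeError(f"Data quality checks failed: {', '.join(failures)}")
finally:
    conn.close()
```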

Posted 1 month ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences that drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
Lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes.
Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
Lead and mentor engineering discussions, advocating for best practices.
Actively participate in design and code reviews.
Access and explore third-party data APIs to determine the data required to meet business needs (see the sketch after this listing).
Ensure data quality and integrity across different sources and systems.
Manage data pipelines for both analytics and operational purposes.
Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
Have over 5 years of experience in Data Engineering, focused on building and maintaining data environments.
Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions in an SLA-driven environment.
Have a strong background in developing data products and APIs and in maintaining testing, monitoring, isolation, and SLA processes.
Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, or MongoDB).
Are proficient in Python or other scripting languages.
Are familiar with columnar OLAP databases and data modeling.
Have experience building ELT/ETL processes with tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau.
Have excellent communication and interpersonal skills to collaborate effectively with business stakeholders and translate requirements.

Added bonus if you also have:
A good understanding of Salesforce and NetSuite systems
Experience in SaaS environments
Experience designing and deploying ML models
Experience with events and streaming data

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
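The "accessing and exploring third-party data APIs" responsibility above often reduces to paginated extraction into a staging file. A minimal sketch follows; the endpoint, token, and cursor-based pagination contract are all hypothetical assumptions, not a specific vendor's API.

```python
# Minimal sketch: pull all records from a paginated REST API into
# newline-delimited JSON, ready for warehouse staging.
import json

import requests

BASE_URL = "https://api.example.com/v1/invoices"    # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <token>"}       # hypothetical token


def extract(path: str) -> None:
    """Walk cursor-based pages and append each record as one JSON line."""
    params = {"limit": 100}
    with open(path, "w", encoding="utf-8") as out:
        while True:
            resp = requests.get(BASE_URL, headers=HEADERS, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            for record in payload["data"]:
                out.write(json.dumps(record) + "\n")
            cursor = payload.get("next_cursor")     # assumed pagination contract
            if not cursor:
                break
            params["cursor"] = cursor


extract("invoices.jsonl")
```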

Posted 1 month ago

Apply

10.0 - 14.0 years

8 - 15 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences that drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
Lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes.
Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
Lead and mentor engineering discussions, advocating for best practices.
Actively participate in design and code reviews.
Access and explore third-party data APIs to determine the data required to meet business needs.
Ensure data quality and integrity across different sources and systems.
Manage data pipelines for both analytics and operational purposes.
Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
Have over 5 years of experience in Data Engineering, focused on building and maintaining data environments.
Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions in an SLA-driven environment.
Have a strong background in developing data products and APIs and in maintaining testing, monitoring, isolation, and SLA processes.
Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, or MongoDB).
Are proficient in Python or other scripting languages.
Are familiar with columnar OLAP databases and data modeling.
Have experience building ELT/ETL processes with tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau.
Have excellent communication and interpersonal skills to collaborate effectively with business stakeholders and translate requirements.

Added bonus if you also have:
A good understanding of Salesforce and NetSuite systems
Experience in SaaS environments
Experience designing and deploying ML models
Experience with events and streaming data

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
Timings: 11 AM to 8 PM IST

Posted 1 month ago

Apply

6.0 - 11.0 years

7 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences that drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
Lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes.
Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
Lead and mentor engineering discussions, advocating for best practices.
Actively participate in design and code reviews.
Access and explore third-party data APIs to determine the data required to meet business needs.
Ensure data quality and integrity across different sources and systems.
Manage data pipelines for both analytics and operational purposes.
Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
Have over 5 years of experience in Data Engineering, focused on building and maintaining data environments.
Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions in an SLA-driven environment.
Have a strong background in developing data products and APIs and in maintaining testing, monitoring, isolation, and SLA processes.
Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, or MongoDB).
Are proficient in Python or other scripting languages.
Are familiar with columnar OLAP databases and data modeling.
Have experience building ELT/ETL processes with tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau.
Have excellent communication and interpersonal skills to collaborate effectively with business stakeholders and translate requirements.

Added bonus if you also have:
A good understanding of Salesforce and NetSuite systems
Experience in SaaS environments
Experience designing and deploying ML models
Experience with events and streaming data

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Hyderabad, Pune

Work from Office

We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As part of our Corporate Engineering division, our vision is to spearhead technology- and data-led solutions and experiences that drive growth and innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance, and GTM, along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
Lead the analysis, design, and implementation of data solutions, including constructing data models and ETL processes.
Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
Lead and mentor engineering discussions, advocating for best practices.
Actively participate in design and code reviews.
Access and explore third-party data APIs to determine the data required to meet business needs.
Ensure data quality and integrity across different sources and systems.
Manage data pipelines for both analytics and operational purposes.
Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
Have over 5 years of experience in Data Engineering, focused on building and maintaining data environments.
Have at least 5 years of experience designing and constructing ETL/ELT processes and managing data solutions in an SLA-driven environment.
Have a strong background in developing data products and APIs and in maintaining testing, monitoring, isolation, and SLA processes.
Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, or MongoDB).
Are proficient in Python or other scripting languages.
Are familiar with columnar OLAP databases and data modeling.
Have experience building ELT/ETL processes with tools like dbt, Airflow, and Fivetran, CI/CD using GitHub, and reporting in Tableau.
Have excellent communication and interpersonal skills to collaborate effectively with business stakeholders and translate requirements.

Added bonus if you also have:
A good understanding of Salesforce and NetSuite systems
Experience in SaaS environments
Experience designing and deploying ML models
Experience with events and streaming data (a sketch of this pattern follows below)
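The "events and streaming data" bonus skill mentioned above commonly means consuming events from a broker and micro-batching them toward the warehouse. A minimal sketch using the kafka-python client follows; the topic, brokers, and flush target are hypothetical placeholders.

```python
# Minimal sketch: consume JSON events from Kafka and flush them in batches.
# In a real pipeline the flush step would stage rows to object storage or
# load them into the warehouse (e.g., via Snowpipe) before committing offsets.
import json

from kafka import KafkaConsumer

BATCH_SIZE = 500

consumer = KafkaConsumer(
    "product-events",                      # hypothetical topic
    bootstrap_servers=["broker:9092"],     # hypothetical brokers
    group_id="warehouse-loader",
    auto_offset_reset="earliest",
    enable_auto_commit=False,              # commit only after a successful flush
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= BATCH_SIZE:
        print(f"flushing {len(batch)} events")  # replace with the real load step
        batch.clear()
        consumer.commit()                  # offsets advance only after the flush
```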

Posted 1 month ago

Apply