4.0 - 7.0 years
7 - 17 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
Key Responsibilities:
- Design, develop, and maintain data transformation pipelines using dbt/IICS on Snowflake.
- Write optimized SQL and Python scripts for complex data modeling and processing tasks.
- Collaborate with data analysts, engineers, and business teams to implement scalable ELT workflows.
- Create and manage data models, schemas, and documentation in dbt.
- Optimize Snowflake performance using best practices (clustering, caching, virtual warehouses).
- Manage data integration from data lakes, external systems, and cloud sources.
- Ensure data quality, lineage, version control, and compliance across all environments.
- Participate in code reviews, testing, and deployment activities using CI/CD pipelines.

Required Skills:
- 5-8 years of experience in Data Engineering or Data Platform Development.
- Hands-on experience with Snowflake: data warehousing, architecture, and performance tuning.
- Proficiency in dbt (Data Build Tool): model creation, Jinja templates, macros, testing, and documentation (see the sketch after this listing).
- Hands-on experience creating mappings and workflows in IICS, with extensive experience in performance tuning and troubleshooting.
- Strong Python scripting for data transformation and automation.
- Advanced SQL skills: writing, debugging, and tuning queries.
- Experience with Data Lake and Data Warehouse concepts and implementations.
- Familiarity with Git-based workflows and version control in dbt projects.

Preferred Skills (Good to Have):
- Experience with Airflow, Dagster, or other orchestration tools.
- Knowledge of cloud platforms such as AWS, Azure, or GCP.
- Exposure to BI tools such as Power BI, Tableau, or Looker.
- Understanding of Data Governance, Security, and Compliance.
- Experience leading a development team.
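For context on what "model creation, Jinja templates, macros, testing" looks like in dbt-on-Snowflake work of this kind, here is a minimal illustrative sketch. The model, source, column, and macro names (`fct_orders`, `sales.raw_orders`, `cents_to_dollars`) are hypothetical, not taken from the posting:

```sql
-- models/marts/fct_orders.sql -- a minimal dbt incremental model (all names hypothetical)
{{ config(
    materialized = 'incremental',
    unique_key   = 'order_id',
    cluster_by   = ['order_date']            -- Snowflake clustering key set via dbt
) }}

select
    o.order_id,
    o.customer_id,
    o.order_date,
    o.loaded_at,
    {{ cents_to_dollars('o.amount_cents') }} as order_amount  -- project-defined macro (assumed)
from {{ source('sales', 'raw_orders') }} o
{% if is_incremental() %}
  -- on incremental runs, process only rows newer than what is already loaded
  where o.loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

In a setup like this, `unique` and `not_null` tests on `order_id` would typically live in a schema.yml file and run with `dbt test`.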
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
We are looking for a Data Scientist to join our dynamic team dedicated to developing cutting-edge AI-powered search and research tools that are revolutionizing how teams access information and make informed decisions. As a Data Scientist, you will play a crucial role in transforming complex datasets into valuable insights, making an impact at the forefront of productivity and intelligence tool development.

Your responsibilities will include owning and managing the data transformation layer using dbt and SQL, designing scalable data models, maintaining business logic, creating intuitive dashboards and visualizations using modern BI tools, collaborating with various teams to uncover key insights, working with diverse structured and unstructured data sources such as Snowflake and MongoDB, and translating business questions into data-driven recommendations. Additionally, you will support experimentation and A/B testing strategies across teams.

The ideal candidate will have 4-8 years of experience in analytics, data engineering, or BI roles, with strong proficiency in SQL, dbt, and Python (pandas, plotly, etc.). Experience with BI tools, dashboard creation, and working with multiple data sources is essential. Excellent communication skills are a must, as you will collaborate across global teams. Familiarity with Snowflake, MongoDB, or Airflow, startup experience, or a background in experimentation is considered a bonus.

Joining our team means being part of a global effort to redefine enterprise search and research with a clear vision and strong leadership. If you are passionate about solving complex data challenges, enjoy working independently, and thrive in a collaborative environment with brilliant minds, this role offers an exciting opportunity for professional growth and innovation.

Location: Abu Dhabi
Experience: 4-8 years
Role Type: Individual Contributor | Reports to Team Lead (Abu Dhabi)
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
You will be joining a renowned consulting firm known for being consistently ranked as one of the world's best places to work. The company has maintained a top position on Glassdoor's Best Places to Work list since 2009, emphasizing the importance of extraordinary teams in their business strategy. By intentionally bringing together diverse backgrounds, cultures, experiences, perspectives, and skills in a supportive and inclusive work environment, they ensure that every individual can thrive both professionally and personally.

As part of the Application Engineering experts team within the AI, Insights & Solutions division, you will collaborate with a multidisciplinary group of professionals including analytics, engineering, product management, and design experts. Your role will involve leveraging deep technical expertise along with business acumen to assist clients in addressing their most transformative challenges. Working in integrated teams, you will develop data-driven strategies and innovative solutions that drive competitive advantage for clients by harnessing the power of data and artificial intelligence.

Your responsibilities will include designing, developing, and maintaining cloud-based AI applications using a full-stack technology stack to deliver high-quality, scalable, and secure solutions. You will collaborate with cross-functional teams to define and implement analytics features, utilize Kubernetes and containerization technologies for deployment, develop APIs and microservices, ensure robust security measures, monitor application performance, contribute to coding standards, stay updated on emerging technologies, automate deployment processes, and work closely with clients to assess opportunities and develop analytics solutions.

To qualify for this position, you are required to have a Master's degree in Computer Science, Engineering, or a related technical field, along with at least 6 years of experience at a Senior or Staff level. Proficiency in client-side and server-side technologies, cloud platforms, Python, Git, DevOps, CI/CD, and various other technical skills is necessary. Additionally, strong interpersonal and communication skills, curiosity, proactivity, critical thinking, and a solid foundation in computer science fundamentals are essential for this role. This role also requires a willingness to travel up to 30% of the time.

If you are looking for an opportunity to work in a collaborative and supportive environment, continuously learn and grow, and contribute to developing cutting-edge analytics solutions for clients across different sectors, this position may be the perfect fit for you.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, developing, and maintaining dashboards and reports using Sigma Computing. Your main focus will be on collaborating with business stakeholders to understand data requirements and deliver actionable insights. It will be crucial for you to write and optimize SQL queries that run directly on cloud data warehouses. Additionally, enabling self-service analytics for business users via Sigma's spreadsheet interface and templates will be part of your responsibilities. You will need to apply row-level security and user-level filters to ensure proper data access controls (one warehouse-side approach is sketched after this listing).

Furthermore, you will work closely with data engineering teams to validate data accuracy and ensure model alignment. Troubleshooting performance or data issues in reports and dashboards will also be a key aspect of your role. You will be expected to train and support users on Sigma best practices, tools, and data literacy.

To excel in this role, you should have at least 5 years of experience in Business Intelligence, Analytics, or Data Visualization roles. Hands-on experience with Sigma Computing is highly preferred. Strong SQL skills and experience working with cloud data platforms such as Snowflake, BigQuery, or Redshift are essential. Familiarity with data modeling concepts and modern data stacks is required. Your ability to translate business requirements into technical solutions will be crucial. Knowledge of data governance, security, and role-based access controls is important. Excellent communication and stakeholder management skills are necessary for effective collaboration.

Experience with tools like Looker, Tableau, Power BI, or similar tools will be beneficial for comparative insights. Familiarity with dbt, Fivetran, or other ELT/ETL tools is a plus. Exposure to Agile or Scrum methodologies would also be advantageous.
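Where row-level security is enforced in the warehouse rather than in Sigma itself, a Snowflake row access policy is one common pattern. A minimal sketch, assuming a hypothetical `user_region_map` table that maps warehouse users to the regions they may see:

```sql
-- Restrict rows by the querying user's allowed regions (all object names hypothetical).
create or replace row access policy sales_region_policy
  as (region varchar) returns boolean ->
    exists (
      select 1
      from security.user_region_map m
      where m.user_name = current_user()
        and m.region    = region
    );

-- Attach the policy so every query on the table is filtered automatically.
alter table analytics.sales
  add row access policy sales_region_policy on (region);
```

Sigma also supports user-attribute-based filtering on its own side; which layer to enforce in depends on the governance requirements of the deployment.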
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Tech Lead, Data Architecture at Fiserv, you will play a crucial role in our data warehousing strategy and implementation. Your responsibilities will include designing, developing, and leading the adoption of Snowflake-based solutions to ensure efficient and secure data systems that drive our business analytics and decision-making processes. Collaborating with cross-functional teams, you will define and implement best practices for data modeling, schema design, and query optimization in Snowflake. Additionally, you will develop and manage ETL/ELT workflows to ingest, transform, and load data from various sources into Snowflake, integrating data from diverse systems such as databases, APIs, flat files, and cloud storage.

Monitoring and tuning Snowflake performance, you will manage caching, clustering, and partitioning to enhance efficiency while analyzing and resolving query performance bottlenecks (an illustrative tuning sketch follows this listing). You will work closely with data analysts, data engineers, and business users to understand reporting and analytic needs, ensuring seamless integration with BI tools like Power BI. Your role will also involve collaborating with the DevOps team on automation, deployment, and monitoring, as well as planning and executing strategies for scaling Snowflake environments as data volume grows. Keeping up to date with emerging trends and technologies in data warehousing and data management is essential, along with providing technical support, troubleshooting, and guidance to users accessing the data warehouse.

To be successful in this role, you must have 8 to 10 years of experience in data management tools like Snowflake, StreamSets, and Informatica. Experience with monitoring tools like Dynatrace and Splunk, Kubernetes cluster management, and Linux OS is required. Additionally, familiarity with containerization technologies, cloud services, CI/CD pipelines, and banking or financial services experience would be advantageous.

Thank you for considering employment with Fiserv. To apply, please use your legal name, complete the step-by-step profile, and attach your resume. Fiserv is committed to diversity and inclusion and does not accept resume submissions from agencies outside of existing agreements. Beware of fraudulent job postings not affiliated with Fiserv to protect your personal information and financial security.
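The clustering and warehouse-tuning work described above typically comes down to a handful of Snowflake statements. A minimal sketch; the table, clustering keys, and warehouse name are hypothetical:

```sql
-- Define clustering keys so partition pruning works for common date/account filters.
alter table dw.fct_transactions cluster by (txn_date, account_id);

-- Inspect clustering health (depth, overlap) for those keys.
select system$clustering_information('dw.fct_transactions', '(txn_date, account_id)');

-- Right-size a virtual warehouse and auto-suspend it to balance speed against cost.
alter warehouse reporting_wh set
  warehouse_size = 'MEDIUM'
  auto_suspend   = 60        -- seconds idle before suspending
  auto_resume    = true;
```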
Posted 2 weeks ago
9.0 - 13.0 years
0 Lacs
hyderabad, telangana
On-site
You have a great opportunity to join our team as a Data Architect with 9+ years of experience. In this role, you will be responsible for designing, implementing, and managing cloud-based solutions on AWS and Snowflake. Your main tasks will include working with stakeholders to gather requirements, designing solutions, developing and executing test plans, and overseeing the information architecture for the data warehouse.

To excel in this role, you must have strong skills in Snowflake and DBT, along with data architecture design experience in data warehousing. Knowledge of or hands-on experience with Informatica or another ETL tool, as well as an understanding of Databricks, would be beneficial. You should have 9-11 years of IT experience, with 3+ years of data architecture experience in data warehousing and 4+ years in Snowflake.

As a Data Architect, you will optimize Snowflake configurations and data pipelines to improve performance, scalability, and overall efficiency. You should have a deep understanding of data warehousing, enterprise architectures, dimensional modeling, star and snowflake schema design (a minimal star-schema query is sketched after this listing), reference DW architectures, ETL architecture, ETL (Extract, Transform, Load), data analysis, data conversion and transformation, database design, data warehouse optimization, data mart development, and enterprise data warehouse maintenance and support.

In addition to your technical responsibilities, you will maintain detailed documentation for data solutions and processes, provide training and leadership to share expertise and best practices with the team, and collaborate with the data engineering team to ensure that data solutions are developed according to best practices.

If you have 10+ years of overall experience in architecting and building large-scale, distributed big data products, expertise in designing and implementing highly scalable, highly available cloud services and solutions, experience with AWS and Snowflake, and a strong understanding of data warehousing and data engineering principles, then this role is perfect for you.

This is a full-time position based in Hyderabad, Telangana, with a Monday-to-Friday work schedule, so you must be able to reliably commute or plan to relocate before starting work. As part of the application process, we would like to know your notice period, years of experience in Snowflake, data architecture experience in data warehousing, current location, willingness to work from the office in Hyderabad, current CTC, and expected CTC. If you meet the requirements and are excited about this opportunity, we look forward to receiving your application.

(Note: a total of 9 years of work experience is required for this position.)
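As a reference point for the dimensional-modeling skills listed above, here is a minimal star-schema query sketch; the fact and dimension tables (`fct_sales`, `dim_date`, `dim_product`) are hypothetical:

```sql
-- One fact table joined to conformed dimensions, aggregated for reporting.
select
    d.calendar_month,
    p.product_category,
    sum(f.sales_amount) as total_sales
from dw.fct_sales   f
join dw.dim_date    d on f.date_key    = d.date_key
join dw.dim_product p on f.product_key = p.product_key
group by d.calendar_month, p.product_category
order by d.calendar_month, total_sales desc;
```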
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
At PwC, our managed services team focuses on providing outsourced solutions and support to clients across various functions. We help organizations streamline operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services; managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services.

As an Associate at PwC, you will work as part of a team of problem solvers, assisting in solving complex business issues from strategy to execution. Professional skills and responsibilities at this level include using feedback and reflection to develop self-awareness, demonstrating critical thinking, and bringing order to unstructured problems. You will be involved in ticket quality review, status reporting for projects, adherence to SLAs, incident management, change management, and problem management. Additionally, you will seek opportunities for exposure to different situations, environments, and perspectives; uphold the firm's code of ethics; demonstrate leadership capabilities; and work in a team environment that includes client interactions and cross-team collaboration.

Required Skills:
- AWS Cloud Engineer
- Minimum 2 years of hands-on experience building advanced data warehousing solutions on leading cloud platforms
- Minimum 1-3 years of Operate/Managed Services/Production Support experience
- Extensive experience developing scalable, repeatable, and secure data structures and pipelines
- Designing and implementing data pipelines for data ingestion, processing, and transformation in AWS
- Building efficient ETL/ELT processes using industry-leading tools such as AWS, PySpark, SQL, and Python
- Implementing data validation and cleansing procedures
- Monitoring and troubleshooting data pipelines
- Implementing and maintaining data security and privacy measures
- Strong communication, problem-solving, quantitative, and analytical abilities

Nice To Have:
- AWS certification

In our Managed Services platform, we deliver integrated services and solutions grounded in deep industry experience and powered by talent. Our team provides scalable solutions that add value to our clients' enterprises through technology and human-enabled experiences. We focus on empowering clients to navigate and capture the value of their Data & Analytics portfolio while cost-effectively operating and protecting their solutions. As a member of our Data, Analytics & Insights Managed Service team, you will work on critical offerings, help desk support, enhancement and optimization work, and strategic roadmap and advisory-level work. Your contribution will be crucial in supporting customer engagements both technically and relationally.
Posted 2 weeks ago
6.0 - 11.0 years
15 - 30 Lacs
Navi Mumbai, Pune, Bengaluru
Work from Office
Dear Candidate,

Hope you are doing well. Greetings from NAM Info Inc.

NAM Info Inc. is a technology-forward talent management organization dedicated to bridging the gap between industry leaders and exceptional human resources. They pride themselves on delivering quality candidates, deep industry coverage, and knowledge-based training for consultants. Their commitment to long-term partnerships, rooted in ethical practices and trust, positions them as a preferred partner for many industries. Learn more about their vision, achievements, and services on their website at www.nam-it.com.

We have an open position for a Data Engineer role with our company for the Bangalore, Pune, and Mumbai locations.

Job Description
Position: Sr/Lead Data Engineer
Location: Bangalore, Pune, and Mumbai
Experience: 5+ years
Required Skills: Azure, Data Warehouse, Python, Spark, PySpark, Snowflake/Databricks, any RDBMS, any ETL tool, SQL, Unix scripting, GitHub
Strong experience in Azure/AWS/GCP
Permanent with NAM Info Pvt Ltd
Working time: 12 PM to 9 PM or 2 PM to 11 PM
5 days work from office, Monday to Friday
L1 interview virtual; L2 face to face at the Banashankari office (for Bangalore candidates)
Notice period: immediate to 15 days

If you are fine with the above job details, please share your resume with ananya.das@nam-it.com.

Regards,
Recruitment Team
NAM Info Inc
Posted 2 weeks ago
5.0 - 10.0 years
15 - 30 Lacs
Pune, Chennai, Bengaluru
Work from Office
Experience: 5+ years
Work Location: Mumbai, Pune, Chennai, Bangalore
Company: MNC
Notice Period: Immediate to 15 days
Role: Snowflake Developer

Job Description
- Bachelor's degree in computer science, engineering, or a similar quantitative field
- 5+ years of relevant experience developing backend, integration, data pipelining, and infrastructure
- Strong expertise in Python, PySpark, and Snowpark
- Proven experience with Snowflake and AWS cloud platforms
- Experience with Informatica/IICS for data integration
- Good experience in DBT and Airflow
- Experience with data warehousing and writing efficient SQL queries
- Understanding of data structures and algorithms preferred
- Knowledge of DevOps best practices and associated tools:
  - Containers and containerization technologies (Kubernetes, Argo, Red Hat OpenShift)
  - Infrastructure as code (Terraform)
  - CI/CD pipelines (JFrog Artifactory)
  - Scripting and automation (Python, GitHub, GitHub Actions)

Interested candidates are requested to share their resumes at piyush.kumar@axiomsoftwaresolutions.com.
Posted 2 weeks ago
2.0 - 6.0 years
5 - 9 Lacs
Bengaluru, Karnataka, India
On-site
Join our Digitalization Technology and Services (DTS) team based in Bangalore.

You'll make a difference by developing and delivering parts of a product in accordance with the customer's requirements and organizational quality norms. Activities to be performed include:
- Implementing features and/or bug fixes, delivering solutions in accordance with coding guidelines, on time and with high quality.
- Identifying and implementing a test strategy to ensure the solution addresses customer requirements and that the product's quality and security requirements are met.
- Communicating within the team as well as with all stakeholders.

Job Requirements/Skills:
- 4-6 years of work experience in software engineering, especially in professional software product development.
- Strong experience in the Snowflake database and its tooling.
- Strong knowledge of RDBMS, stored procedures, and triggers (an illustrative Snowflake procedure is sketched after this listing).
- Strong knowledge of DBT.
- Basic knowledge of AWS services.
- Knowledge of a programming language such as Python or Java.
- Knowledge of software engineering processes.
- Basic experience with Agile/Lean and SAFe practices is preferred.
- Knowledge of source code management tools like git.
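For the stored-procedure skills mentioned above, here is a minimal sketch of a Snowflake SQL Scripting procedure; the schema, tables, and retention logic are hypothetical. (Note that Snowflake itself does not support table triggers, so trigger experience would apply to other RDBMS platforms.)

```sql
-- Archive events older than a retention window, then report how many rows moved.
create or replace procedure util.archive_old_events(days_to_keep integer)
returns varchar
language sql
as
$$
declare
  rows_moved integer;
begin
  insert into dw.events_archive
    select * from dw.events
    where event_ts < dateadd(day, -:days_to_keep, current_timestamp());
  rows_moved := sqlrowcount;                 -- rows affected by the last DML statement
  delete from dw.events
    where event_ts < dateadd(day, -:days_to_keep, current_timestamp());
  return 'archived ' || rows_moved || ' rows';
end;
$$;

-- Usage: call util.archive_old_events(90);
```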
Posted 2 weeks ago
1.0 - 3.0 years
1 - 3 Lacs
Hyderabad, Telangana, India
On-site
- Snowflake and DBT expertise: strong hands-on experience in Snowflake and DBT.
- Data loading: able to load data from various sources, supporting both batch and real-time ingestion (an illustrative pattern is sketched after this listing).
- SQL procedures: excellent at writing and optimizing SQL procedures.
- Technical guidance: capable of guiding the team on technical matters and best practices.
- Support flexibility: flexible to work during expected support hours.
- Enhancement delivery: responsible for delivering enhancements on time with high quality.
- Performance optimization: experienced in suggesting and implementing performance improvement ideas.
- Issue analysis: able to independently analyze data issues and communicate clearly with customers.
- Requirement accuracy: pays close attention to detail when gathering requirements or providing updates.
- Client collaboration: works directly with the client to define and create a future roadmap.
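Batch and real-time ingestion into Snowflake, as called for above, is commonly a bulk `COPY INTO` for scheduled loads plus a Snowpipe for continuous loads. A minimal sketch; the stage, table, and file layout are hypothetical:

```sql
-- Batch: bulk-load staged files on a schedule.
copy into raw.orders
  from @raw.ext_stage/orders/
  file_format = (type = csv skip_header = 1);

-- Near real time: a pipe that auto-ingests new files as they land in the stage.
create or replace pipe raw.orders_pipe auto_ingest = true as
  copy into raw.orders
    from @raw.ext_stage/orders/
    file_format = (type = csv skip_header = 1);
```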
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As the Lead Data Engineer at Mastercard, you will be responsible for designing and building scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices. Your role will involve mentoring and guiding other engineers, fostering a culture of curiosity and continuous improvement, and creating robust ETL/ELT pipelines that serve business-critical use cases.

You will lead by example: writing high-quality, testable code, participating in architecture and design discussions, and decomposing complex problems into scalable components aligned with platform and product goals. Championing best practices in data engineering, you will drive collaboration across teams, support data governance and quality efforts, and continuously learn and apply new technologies to improve team productivity and platform reliability.

To succeed in this role, you should have at least 5 years of hands-on experience in data engineering with strong PySpark and Python skills. You should also possess solid experience in designing and implementing data models, pipelines, and batch/stream processing systems. Additionally, you should be comfortable working with cloud platforms such as AWS, Azure, or GCP and have a strong foundation in data modeling, database design, and performance optimization. A bachelor's degree in computer science, engineering, or a related field is required, along with experience in Agile/Scrum development environments. Experience with CI/CD practices, version control, and automated testing is essential, as is the ability to mentor and uplift junior engineers. Familiarity with cloud-related services like S3, Glue, Data Factory, and Databricks is highly desirable.

Furthermore, exposure to data governance tools and practices, orchestration tools, containerization, and infrastructure automation will be advantageous. A master's degree, relevant certifications, or contributions to open-source or data engineering communities will be considered a bonus. Exposure to machine learning data pipelines or MLOps is also a plus.

If you are a curious, adaptable, and driven individual who enjoys problem-solving and continuous improvement, and if you have a passion for building clean data pipelines and cloud-native designs, then this role is perfect for you. Join us at Mastercard and be part of a team dedicated to unlocking the potential of data assets and shaping the future of data engineering.
Posted 2 weeks ago
0.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Assistant Vice President, Enterprise Architecture Consulting - GCP Delivery Lead.

The Delivery Lead will be accountable for the effective execution of extensive data transformation initiatives utilizing Google Cloud Platform. This leadership position entails supervising both legacy-to-GCP migrations and new implementations, guaranteeing high-quality delivery, innovation, and business value. The suitable candidate should possess significant experience in GCP program management, as well as proficiency in data engineering, cloud platforms, and analytics solutions. They will be tasked with client engagement, team leadership, delivery governance, and strategic innovations in GCP-based solutions.

Key Responsibilities:
- Lead end-to-end delivery of GCP projects, including migrations from legacy systems and greenfield implementations.
- Define and enforce delivery governance frameworks, best practices, and methodologies for GCP programs.
- Act as the primary interface for clients, ensuring strong relationships and alignment with their data strategy.
- Offer expert guidance on GCP and contemporary data architectures such as data mesh/fabric methodology; substantial experience with SSOT frameworks and guiding clients on best practices is expected.
- Knowledge of containerization architecture is essential, along with experience in data vault data modeling.
- Build, mentor, and manage a high-performing team of GCP architects, data engineers, and analysts.
- Drive team upskilling and certifications in GCP, data engineering, and analytics tools.
- Foster a strong DevOps and Agile culture, ensuring efficient execution through CI/CD automation.
- Stay ahead of emerging trends in GCP, cloud data engineering, and analytics to drive innovation; substantial experience with advanced GCP techniques such as BI Engine and history-based optimization is required.
- Demonstrate a comprehensive understanding of, and practical experience with, GenAI and agentic AI; review architectural decks and offer solutions for identified pain points, ensuring successful project delivery.
- Be proficient in ELT solutioning utilizing DBT and the native services of BigQuery (Dataform).
- Promote AI/ML, automation, and real-time analytics to enhance data platform capabilities.
- Develop accelerators, reusable frameworks, and best practices for efficient delivery.
- Ensure data security, compliance, and regulatory adherence in projects.
- Implement performance monitoring, cost optimization, and disaster recovery strategies for GCP solutions.

Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred).
- Experience in IT services, with experience in GCP and cloud-based data solutions.

Preferred Qualifications/Skills
- Proven track record in managing large-scale GCP programs, including legacy data migrations and new implementations.
- Deep understanding of data engineering, ETL, and cloud-native architectures.
- Strong expertise in GCP ecosystems, including streams, orchestration, ingestion, governance, stewardship, ELT/ETL, tasks, data sharing, and performance optimization.
- Experience with cloud platforms (AWS, Azure).
- Proficiency in SQL, Python, Spark, and modern data processing frameworks.

Preferred Certifications:
- Certified GCP Solution Architect.
- Cloud certifications (AWS Certified Data Analytics, Azure Data Engineering, or equivalent).
- PMP, ITIL, or SAFe Agile certifications for delivery governance.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities.
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago
1.0 - 3.0 years
7 - 9 Lacs
Hyderabad, Telangana, India
On-site
Looking for a person with good hands-on experience in Snowflake and DBT.
- Should be able to load data from different sources, both batch-wise and in real time.
- Should be excellent at writing SQL procedures.
- Should be able to guide the team technically.
- Should be flexible to work during expected support hours.
- Should be able to deliver enhancements on time.
- Should have experience in coming up with performance improvement ideas.
- Should be able to analyze data issues and communicate clearly with the customer independently.
- Should pay attention to detail while taking requirements from the customer or sharing updates.
- Able to work with the client and create a future roadmap.

Mandatory skills: SQL, Snowflake, DBT
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
maharashtra
On-site
As a Senior Data Engineer at Assent, you will play a critical role in advancing data engineering practices and serving as a hands-on technical expert. You will contribute significantly to the vision and execution of data engineering efforts, ensuring the development of secure, robust, scalable, and high-performing data platforms. Your role involves driving key technical initiatives, collaborating on architectural decisions, and executing complex data engineering tasks to align data infrastructure with Assent's business goals.

You will actively participate in the design, development, and implementation of sophisticated data solutions while offering mentorship and technical guidance to other engineers. Your contributions will have a broad impact across the organization as you champion best practices, innovate to enhance data engineering capabilities, and work directly on the technical solutions required to advance data systems.

Key Requirements & Responsibilities:
- Contribute to the strategic vision for data engineering and participate in the architectural design and development of new and complex data solutions, focusing on scalability, performance, and hands-on implementation.
- Design and implement new data systems and infrastructure, ensuring the reliability and scalability of data systems by actively contributing to day-to-day engineering tasks.
- Influence key decisions regarding the data technology stack, infrastructure, and tools while actively engaging in hands-on engineering efforts in the creation and deployment of new data architectures and workflows.
- Set coding standards and best practices for the Data Engineering & Operations team, conducting and participating in code reviews to maintain high-quality, consistent code.
- Work closely with database developers, software development, product management, and AI/ML developers to align data initiatives with Assent's organizational goals.
- Collaborate with team members to monitor progress, adjust priorities, and meet project deadlines and objectives.
- Identify opportunities for internal process improvements, including automating manual processes and optimizing data delivery.
- Proactively support peers in continuous learning by providing technical guidance and training on data engineering development, analysis, and execution.

Qualifications: Your Knowledge, Skills, and Abilities
We strongly value your talent, energy, and passion. It will be valuable to Assent if you have the following qualifications:
- A degree in a related field and 7+ years of data engineering experience.
- Proficiency in tools and languages such as AWS, dbt, Snowflake, Git, R, Python, SQL, and SQL Server.
- Strong project management skills and the ability to communicate complex concepts effectively.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Adeptness at conveying complex data insights through a robust understanding of data management systems, warehouse methodologies, data quality standards, data modeling techniques, governance protocols, and advanced analytics.
- Familiarity with agile work environments and scrum ceremonies.
- Strong business acumen and experience in aligning data initiatives with business objectives.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. Your role will require strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools like Snowflake/BigQuery/Redshift and BI tools like Tableau/QuickSight/Power BI (at least one of each is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool DBT.

Your responsibilities will include implementing ETL/ELT processes and building data pipelines, covering workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally.

Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a perfect balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees. Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does.

GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Manager, Data Platform Engineering at Assent, you will play a crucial role in leading and coaching a team of data platform engineers. Your primary responsibility will be to keep your team organized, motivated, and engaged in delivering high-quality, scalable software solutions that address key business challenges. By recruiting, developing, and retaining a high-performing team, you will ensure that Assent's products meet the needs of its customers and align with the company's business objectives.

Your role will involve coordinating and allocating work among the team members, driving delivery, and championing product quality. You will collaborate closely with other teams in the product value chain, such as Product Management, User Experience, Quality Assurance, Customer Success, and Infrastructure. Additionally, you will visualize upcoming work, manage product releases, and ensure that the software developed follows Assent's guidelines and standards.

To excel in this role, you should have at least 10 years of progressive experience in a data-focused role, with a proven track record in a leadership position. Strong mentoring and coaching skills are essential to keep your team engaged and motivated. A solid technical background, familiarity with AWS, Snowflake, dbt, Python, SQL, and Kafka, and experience working with large volumes of unstructured data are also required. As a strategic and business-minded individual, you should possess strong analytical skills and be able to manage short-term projects as well as long-term strategies effectively. Adaptability, flexibility, and a growth mindset are key attributes that will help you thrive in Assent's dynamic environment. Your contributions will not only impact the success of Assent but also help address global challenges related to supply chain sustainability.

At Assent, we value your talent, energy, and passion. In addition to competitive financial benefits, we offer wellness programs, flexible work options, volunteer opportunities, and a commitment to diversity, equity, and inclusion. Join us in our mission to create a sustainable future and be a part of a team that values inclusivity, respect, and continuous learning.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
About the Role:
As a Machine Learning Operations Engineer at Onclusive, your primary responsibility will be to deploy, scale, and optimize backend algorithms, robust data ingestion pipelines, machine learning services, and data platforms that handle vast amounts of text and analytics data. You will leverage your technical expertise and Big Data analytics skills to address complex marketing challenges using Onclusive's extensive online content data. The ML Ops Engineer role is crucial to the success of Onclusive.

Your responsibilities include designing and constructing scalable machine learning services and data platforms, utilizing benchmarks and metrics to enhance services, overseeing a system processing tens of millions of jobs daily, researching and implementing cutting-edge algorithms for data analysis, collaborating with data scientists and ML engineers on implementing ML, AI, and NLP techniques, and managing inference services on autoscaling fleets with GPUs and specialized hardware.

Who you are:
You hold a degree (BS, MS, or Ph.D.) in Computer Science or a related field, supported by practical experience. You should have a minimum of 2 years of experience with Kubernetes and/or Terraform; proficiency in Python with a solid grasp of object-oriented programming principles; knowledge of containerization (preferably Docker); experience with infrastructure as code (IaC) for AWS (preferably Terraform); familiarity with version control systems (particularly Git and GitHub) and CI/CD (preferably GitHub Actions); an understanding of release management with a focus on testing and quality assurance; and a good grasp of ML principles. Experience with data engineering tools like Airflow, dbt, and Meltano is desirable, and exposure to deep learning tech stacks such as Torch, TensorFlow, and Transformers is a plus.

What we can offer:
Joining a rapidly growing global company, you will have the opportunity to enhance your skills and advance your career. In return for your contributions, we provide a competitive salary and benefits, a hybrid working environment within a team passionate about their work and supporting each other's development, and a company culture that prioritizes wellbeing and work-life balance through initiatives like flexible working arrangements and mental health support.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
NTT DATA is looking for an experienced Data Engineer (Offshore) to join their team in Bangalore, Karnataka, India. As a Data Engineer, you will be responsible for designing and implementing tailored data solutions to meet customer needs across various technical platforms. You will work with technologies such as Qlik and Python to build and deploy data replication solutions using Qlik. Collaboration across different technical stacks, including Snowflake, AWS, Oracle, dbt, and SQL, is essential. You will also need to generate comprehensive solution documentation and adhere to Agile practices throughout the development process.

The ideal candidate should have a minimum of 3+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects, along with 2+ years of experience leading a team in data-related projects. Demonstrated production experience in core data platforms such as Qlik, Snowflake, AWS, dbt, and SQL is preferred, as is a strong understanding of data integration technologies. Excellent written and verbal communication skills are required to effectively convey complex technical concepts. An undergraduate or graduate degree is preferred.

NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. With a commitment to helping clients innovate, optimize, and transform for long-term success, NTT DATA has a diverse team of experts in over 50 countries and a strong partner ecosystem. Their services range from business and technology consulting to data and artificial intelligence solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. As a leading provider of digital and AI infrastructure globally, NTT DATA is part of the NTT Group, which invests significantly in R&D each year to support organizations and society in moving confidently into the digital future. Visit us at us.nttdata.com.
Posted 2 weeks ago
5.0 - 12.0 years
0 Lacs
chennai, tamil nadu
On-site
You are a skilled Snowflake Developer with a strong background in Data Warehousing (DWH), SQL, Informatica, Power BI, and related tools. With over 5 years of experience, you will be responsible for designing, developing, and maintaining data pipelines, integrating data across multiple platforms, and optimizing large-scale data architectures. Your expertise will contribute to efficient ELT processes using Snowflake, Fivetran, and DBT for data integration and pipeline development.

You will write complex, scalable SQL queries, including stored procedures, to support data transformation, reporting, and analysis. Additionally, you will implement advanced data modeling techniques, such as Slowly Changing Dimensions (SCD Type-2), using DBT (a minimal example is sketched after this listing), and design high-performance data architectures. Collaborating with business stakeholders to understand data needs, troubleshooting data-related issues, ensuring high data quality standards, and documenting data processes will also be part of your responsibilities.

Your qualifications include expertise in Snowflake for data warehousing and ELT processes, strong proficiency in SQL for relational databases, experience with Informatica PowerCenter for data integration and ETL development, and familiarity with tools like Power BI for data visualization, Fivetran for automated ELT pipelines, and Sigma Computing, Tableau, Oracle, and DBT. You possess strong data analysis, requirement gathering, and mapping skills and are familiar with cloud services such as Azure, AWS, or GCP, along with workflow management tools like Airflow, Azkaban, or Luigi. Proficiency in Python for data processing is required, and knowledge of other languages like Java and Scala is a plus. You hold a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or a related quantitative field.

Key skills for this role include data modeling, business intelligence, Python, DBT, Power BI, ETL, DWH, Fivetran, data quality, Snowflake, and SQL. This is an exciting opportunity to work with cutting-edge technologies in a collaborative environment and contribute to building scalable, high-performance data solutions.
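In dbt, SCD Type-2 history like that described above is most often captured with a snapshot. A minimal sketch, assuming a hypothetical `crm.customers` source with a reliable `updated_at` column:

```sql
-- snapshots/customers_snapshot.sql -- track every change to customer rows over time
{% snapshot customers_snapshot %}

{{ config(
    target_schema = 'snapshots',
    unique_key    = 'customer_id',
    strategy      = 'timestamp',
    updated_at    = 'updated_at'
) }}

select * from {{ source('crm', 'customers') }}

{% endsnapshot %}
```

dbt adds `dbt_valid_from` and `dbt_valid_to` columns automatically, so the current version of each row is simply `where dbt_valid_to is null`.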
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Technical Lead of Data Engineering at Assent, you will collaborate with various stakeholders, including Product Managers, Product Designers, and Engineering team members, to identify opportunities and evaluate the feasibility of solutions. Your role will involve offering technical guidance, influencing decision-making, and aligning data engineering initiatives with business objectives as part of Assent's roadmap development. You will be responsible for driving the technical strategy, overseeing team execution, and implementing process improvements to construct resilient and scalable data systems. In addition, you will lead data engineering efforts, mentor a growing team, and establish robust and scalable data infrastructure.

Key Requirements & Responsibilities:
- Lead the technical execution of data engineering projects to ensure high-quality and timely delivery, covering discovery, delivery, and adoption stages.
- Collaborate with Architecture team members to design and implement scalable, high-performance data pipelines and infrastructure.
- Provide technical guidance to the team, ensuring adherence to best practices in data engineering, performance optimization, and system reliability.
- Work cross-functionally with various teams, such as Product Managers, Software Development, Analysts, and AI/ML teams, to define and implement data initiatives.
- Partner with the team manager to plan and prioritize work, striking a balance between short-term deliverables and long-term technical enhancements.
- Keep abreast of emerging technologies and methodologies, advocating for their adoption to accelerate the team's objectives.
- Ensure compliance with corporate security policies and follow the established guidelines and procedures of Assent.

Qualifications: Your Knowledge, Skills and Abilities
- 10+ years of experience in data engineering, software development, or related fields.
- Expertise in cloud data platforms, particularly AWS.
- Expertise in modern data technologies such as Spark, Airflow, dbt, Snowflake, Redshift, or similar.
- Deep understanding of distributed systems and data pipeline design, with specialization in ETL/ELT processes, data warehousing, and real-time streaming.
- Strong programming skills in Python, SQL, Scala, or similar languages.
- Experience with infrastructure-as-code tools like Terraform and CloudFormation, and knowledge of DevOps best practices.
- Ability to influence technical direction and promote best practices across teams.
- Excellent communication and leadership skills, with a focus on fostering collaboration and technical excellence.
- A learning mindset, continuously exploring new technologies and best practices.
- Experience in security, compliance, and governance related to data systems is a plus.

This is not an exhaustive list of duties, and responsibilities may be modified or added as needed to meet business requirements.

Life at Assent: At Assent, we are dedicated to cultivating an inclusive environment where team members feel valued, respected, and heard. Our diversity, equity, and inclusion practices are guided by our Diversity and Inclusion Working Group and Employee Resource Groups (ERGs), ensuring that team members from diverse backgrounds are recruited, retained, and provided opportunities to contribute to business success. If you need assistance or accommodation during any stage of the interview and selection process, please reach out to talent@assent.com, and we will be happy to assist you.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Engineer at Ethoca, a Mastercard company, in Pune, India, you will play a crucial role in driving data enablement and exploring big data solutions within our technology landscape. Your responsibilities will include designing, developing, and optimizing batch and real-time data pipelines using tools such as Snowflake, Snowpark, Python, and PySpark. You will also build data transformation workflows, implement CI/CD pipelines, and administer the Snowflake platform to ensure performance tuning, access management, and platform scalability.

Collaboration with stakeholders to understand data requirements and deliver reliable data solutions will be a key part of your role. Your expertise in cloud-based database infrastructure, SQL development, and building scalable data models using tools like Power BI will be essential in supporting business analytics and dashboarding. Additionally, you will be responsible for real-time data streaming pipelines and data observability practices, and for planning and executing deployments, migrations, and upgrades across data platforms while minimizing service impacts.

To be successful in this role, you should have a strong background in computer science or software engineering, along with deep hands-on experience with Snowflake, Snowpark, Python, PySpark, and CI/CD tooling. Familiarity with schemachange, the Java JDK, the Spring and Spring Boot frameworks, Databricks, and real-time data processing is desirable. You should also possess excellent problem-solving and analytical skills, as well as effective written and verbal communication abilities for collaborating across technical and non-technical teams.

You will be part of a high-performing team committed to making systems resilient and easily maintainable on the cloud. If you are looking for a challenging role that allows you to leverage cutting-edge software and development skills while working with massive data volumes, this position at Ethoca may be the right fit for you.
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
maharashtra
On-site
As the Manager, Data Engineering at Assent, you will play a crucial role in leading a team of data engineering specialists to bring key products and features to life for Assent and its customers. Your responsibilities will include recruiting, developing, and retaining a high-performing and engaged team, and coaching them to deliver high-quality, scalable software that addresses key business problems. You will be responsible for coordinating and allocating work among the team, driving delivery, and championing product quality.

Your role will involve establishing and maintaining rapport with other teams in the product value chain, such as Product Management, User Experience, Quality Assurance, Customer Success, and Infrastructure. You will visualize upcoming work and manage product releases by monitoring progress and making the adjustments necessary to meet the delivery schedule and requirements. Additionally, you will ensure that the team develops software that follows Assent's Design Guidelines, Coding Standards, Security Guidelines, and any other relevant guidelines provided by the Architecture team.

To excel in this role, you should have at least 10 years of progressive experience in a data-focused role with a proven track record in a leadership position. Strong mentoring and coaching skills are essential to keep your team engaged and motivated. A strong technical background, experience with AWS, Snowflake, dbt, Python, and SQL, and experience handling large volumes of unstructured data are also required. You should possess a growth mindset and be highly organized, adaptable, and strategic in your approach to business challenges.

At Assent, we value your talent, energy, and passion. In addition to competitive financial benefits, we offer wellness programs, flexible work options, professional development opportunities, and a commitment to diversity, equity, and inclusion. Join us in our mission to create a sustainable future and make a global impact with your expertise in data engineering.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Cloud Data Architect specializing in BigQuery and CloudSQL at our Chennai office, you will play a crucial role in leading the design and implementation of scalable, secure, and high-performing data architectures using Google Cloud technologies. Your expertise will be essential in shaping architectural direction and ensuring that data solutions meet enterprise-grade standards.

Your responsibilities will include designing data architectures that align with performance, cost-efficiency, and scalability needs; implementing data models, security controls, and access policies across GCP platforms; leading cloud database selection, schema design, and tuning for analytical and transactional workloads; collaborating with DevOps and DataOps teams to deploy and manage data environments; ensuring best practices for data governance, cataloging, and versioning; and enabling real-time and batch integrations using GCP-native tools.

To excel in this role, you must have deep knowledge of BigQuery, CloudSQL, and the GCP data ecosystem, along with strong experience in schema design, partitioning, clustering, and materialized views (an illustrative example follows this listing). Hands-on experience implementing data encryption, IAM policies, and VPC configurations is crucial, as is an understanding of hybrid and multi-cloud data architecture strategies and data lifecycle management. Proficiency in GCP cost optimization is also required.

Preferred skills for this role include experience with AlloyDB, Firebase, or Spanner; familiarity with LookML, dbt, or DAG-based orchestration tools; and exposure to the BFSI domain or financial services architecture. In addition to technical skills, soft skills such as visionary thinking with practical implementation ability, strong communication, and cross-functional leadership are highly valued. Previous experience guiding data strategy in enterprise settings will be advantageous.

Joining our team will give you the opportunity to own data architecture initiatives in a cloud-native ecosystem, drive innovation through scalable and secure GCP designs, and collaborate with forward-thinking data and engineering teams.

Skills required for this role include IAM policies, Spanner, schema design, data architecture, the GCP data ecosystem, dbt, GCP cost optimization, AlloyDB, data encryption, data lifecycle management, BigQuery, LookML, VPC configurations, partitioning, clustering, materialized views, DAG-based orchestration tools, Firebase, and CloudSQL.
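For the BigQuery partitioning, clustering, and materialized-view skills called out above, here is a minimal illustrative DDL sketch; the dataset and table names (`analytics.events`) are hypothetical:

```sql
-- A table partitioned by day and clustered on common filter columns.
create table analytics.events (
  event_ts   timestamp,
  user_id    string,
  event_name string,
  amount     numeric
)
partition by date(event_ts)
cluster by user_id, event_name;

-- A materialized view BigQuery can maintain incrementally and use to
-- rewrite matching aggregate queries automatically.
create materialized view analytics.daily_event_totals as
select
  date(event_ts) as event_date,
  event_name,
  sum(amount)    as total_amount
from analytics.events
group by date(event_ts), event_name;
```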
Posted 2 weeks ago
0.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Ready to build the future with AI? At Genpact, we don't just keep up with technology; we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant, Data Engineer. In this role, you will be responsible for coding, testing, and delivering high-quality deliverables, and you should be willing to learn new technologies.

Responsibilities
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Work with stakeholders to assist with data-related technical issues and support their data infrastructure needs.
- Generate, present, and develop ideas for progressing the data environment, such as common frameworks or common methodologies, which the IM IT division will then use.
- Possess technical curiosity to explore new features in existing tools and technologies, as well as new methodologies, features, and tools that can be adopted by the IM IT division.
- Share, instruct, and coach the Investment Technology division on data topics such as best practices, design methodologies, and query optimization.
- Organize, liaise, and work with other development teams to onboard applications and processes onto the new data architecture (cloud technologies, replication, etc.).
- Desire to learn and become the subject matter expert on tools and technologies.
- Act as a liaison to the various development teams and other data teams to market and proliferate the data architecture doctrines, principles, and standards.
- Help devise and implement pragmatic data governance principles and methodology.
- Perform detailed data analysis and validation.
- Assist with the preparation, coordination, and execution of User Acceptance Testing.

Qualifications we seek in you!
Minimum Qualifications
- BE/B Tech/MCA
- Excellent written and verbal communication skills

Preferred Qualifications/Skills
- Python, SQL, Spark/PySpark, AWS Glue, AWS Aurora (preferably Postgres), AWS S3, dbt.
- Strong relational database design.
- Experience with multi-temporal data.
- Great Expectations, Jasper Reports, workflow experience (BPMN), .NET.
- Ability to work in a team environment, effectively and efficiently and with supervision.
- Capable of managing multiple tasks with tight deadlines.
- Strong analytical ability and excellent attention to detail.
- A strong commitment to quality.
- Strong Excel skills.
- Tools/technologies used: SQL, SQL Server, ETL tools, Autosys; Snowflake/cloud/Azure experience a plus.

Why join Genpact?
- Lead AI-first transformation: build and scale AI solutions that redefine industries.
- Make an impact: drive change for global enterprises and solve business challenges that matter.
- Accelerate your career: gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
- Grow with the best: learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
- Committed to ethical AI: work in an environment where governance, transparency, and security are at the core of everything we build.
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago