
13,124 ETL Jobs - Page 5

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 7.0 years

3 - 4 Lacs

Chennai

On-site

Overview:
- 5 to 7 years of experience; full-time, work-from-office (WFO)
- Hands-on development experience in ETL using ODI 11g/12c, with Oracle SQL and PL/SQL programming experience, OR hands-on development experience in ETL using IICS
- Proficient in data migration techniques and data integration
- Experience with Data Warehouses and/or Data Marts

Qualifications: B.E. or any qualification

Essential skills:
- Hands-on development experience in ETL using ODI 11g/12c
- Oracle SQL and PL/SQL programming experience
- Proficiency in warehousing architecture techniques
- Experience with Data Warehouses and/or Data Marts
- Good communication skills; self-sufficient in collaborating with project teams

Good to have:
- Experience in database modeling (Enterprise Data Warehouse)
- Hands-on exposure to other ETL tools such as Informatica, and to MySQL or SQL Server

Posted 22 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


About Us: Arrise Solutions India Pvt. Ltd. (powering Pragmatic Play) is a leading content provider to the iGaming and betting industry, offering an innovative, regulated, mobile-focused multi-product portfolio. Pragmatic Play strives to create the most engaging and evocative experience for customers globally across a range of products, including slots, live casino, sports betting, virtual sports and bingo. Driven by a persistence to craft immersive experiences and responsible thrills, our professional team consistently delivers best-in-class services with a dedication to creating games that players love time and time again.

Key Responsibilities: We are looking for an experienced MS SQL database developer with a BTech/BE/MCA and 3-5 years of total experience.
- Design database tables; experience with PostgreSQL is preferred.
- Create views, functions and stored procedures.
- Fine-tune existing queries, procedures and functions to improve performance.
- Create and manage table partitions and indexes.
- Experience in ETL is preferable.
- Communication, teamwork and negotiation skills.
- Understanding of information legislation, such as the Data Protection Act.
- The ability to work to tight deadlines under pressure.

What We Offer:
- Professional and personal development
- Opportunities to progress within a dynamic team
- Close and collaborative colleagues

Our Values:
- PERSISTENCE: We never give up and are determined to be the best at what we do.
- RESPECT: We value and respect our clients, their players, and our team members, promoting professionalism, integrity and fairness without compromise.
- OWNERSHIP: We take ownership of our work and consistently deliver in a reliable manner, always providing the highest level of quality.

Posted 22 hours ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities:
- Lead and architect end-to-end data migrations from on-premise and legacy systems to Snowflake, ensuring optimal performance, scalability, and cost-efficiency.
- Design and develop reusable data ingestion and transformation frameworks using Python.
- Build and optimize real-time ingestion pipelines using Kafka, Snowpipe, and the COPY command.
- Utilize SnowConvert to migrate and optimize legacy ETL and SQL logic for Snowflake.
- Design and implement high-performance Snowflake data models, including materialized views, clustering keys, and result caching strategies.
- Monitor resource usage and implement auto-suspend/auto-resume, query profiling, and cost-control measures to manage compute and storage effectively.
- Drive cost governance initiatives, providing insights into credit usage and optimizing workload distribution.
- Integrate Snowflake with AWS services such as S3, Lambda, Glue, and Step Functions to ensure a robust data ecosystem.
- Mentor junior engineers, enforce best practices in development and code quality, and champion agile data engineering practices.

Required Skills and Experience:
- 10+ years of experience in data engineering with a focus on enterprise ETL and cloud data platforms.
- 4+ years of hands-on experience in Snowflake development and architecture.
- Expertise in advanced Snowflake features such as Snowpark, Streams & Tasks, Secure Data Sharing, Data Masking, and Time Travel.
- Proven ability to architect enterprise-grade Snowflake solutions optimized for performance, governance, and scalability.
- Proficient in Python for building orchestration tools, automation, and reusable data pipelines.
- Solid knowledge of AWS services, including S3, IAM, Lambda, Glue, and Step Functions.
- Hands-on experience with SnowConvert or similar tools for legacy code conversion.
- Familiarity with real-time data streaming technologies such as Kafka, Kinesis, or other event-based systems.
- Strong SQL skills with proven experience in query tuning, profiling, and performance optimization.
- Deep understanding of legacy ETL tools; experience with Ab Initio preferred.
- Exposure to CI/CD pipelines, version control systems (e.g., Git), and automated deployment practices.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience in migrating on-premises or mainframe data warehouses to Snowflake.
- Familiarity with BI/analytics tools such as Tableau, Power BI, or Looker.
- Knowledge of data security and compliance best practices, including data masking, RBAC, and OAuth integration.
- Snowflake certifications (Developer, Architect) are a strong plus.
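To make the Snowpipe/COPY and cost-control items above concrete, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage and table names are hypothetical placeholders, not part of the posting.

```python
# Minimal sketch: batch ingest into Snowflake with COPY, plus auto-suspend
# cost controls. Connection details and object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()

# Keep compute costs down: suspend the warehouse after 60s of inactivity.
cur.execute("ALTER WAREHOUSE ETL_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE")

# Bulk-load staged files; Snowpipe runs the same COPY statement continuously.
cur.execute("""
    COPY INTO RAW.ORDERS
    FROM @RAW.ORDERS_STAGE
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE'
""")
print(cur.fetchall())  # per-file load results
cur.close()
conn.close()
```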

Posted 22 hours ago

Apply

5.0 years

3 - 7 Lacs

Ahmedabad

On-site

Location: Ahmedabad / Pune. Required experience: 5+ years; immediate joiners preferred.

We are looking for a highly skilled Lead Data Engineer (Snowflake) to join our team. The ideal candidate will have extensive experience with Snowflake and cloud platforms, along with a strong understanding of ETL processes, data warehousing concepts, and programming languages. If you have a passion for working with large datasets, designing scalable database schemas, and solving complex data problems, this role is for you.

Key Responsibilities:
- Design, implement, and optimize data pipelines and workflows using Apache Airflow (a sketch of such a DAG follows below).
- Develop incremental and full-load strategies with monitoring, retries, and logging.
- Build scalable data models and transformations in dbt, ensuring modularity, documentation, and test coverage.
- Develop and maintain data warehouses in Snowflake.
- Ensure data quality, integrity, and reliability through validation frameworks and automated testing.
- Tune performance through clustering keys, warehouse scaling, materialized views, and query optimization.
- Monitor job performance and resolve data pipeline issues proactively.
- Build and maintain data quality frameworks (null checks, type checks, threshold alerts).
- Partner with data analysts, scientists, and business stakeholders to translate reporting and analytics requirements into technical specifications.

Required Skills & Qualifications:
- Snowflake (data modeling, performance tuning, access control, external tables, streams & tasks)
- Apache Airflow (DAG design, task dependencies, dynamic tasks, error handling)
- dbt (modular SQL development, Jinja templating, testing, documentation)
- Proficiency in SQL, Spark, and Python
- Experience building data pipelines on cloud platforms like AWS, GCP, or Azure
- Strong knowledge of data warehousing concepts and ELT best practices
- Familiarity with version control systems (e.g., Git) and CI/CD practices
- Familiarity with infrastructure-as-code tools like Terraform for provisioning Snowflake or Airflow environments
- Excellent problem-solving skills and the ability to work independently

Perks: flexible timings, 5-day work week, healthy environment, celebrations, learning and growth, community building, medical insurance benefit.
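As an illustration of the Airflow DAG design and dbt items above, a minimal sketch of an incremental-load DAG with retries and dbt steps; the DAG id, model selection and schedule are hypothetical.

```python
# Minimal sketch of the Airflow pattern described above: an incremental-load
# DAG with retries and dbt transformation/test steps. Names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_increment(**context):
    # Placeholder: pull rows newer than the last successful run
    # (real code would query the source using the data interval bounds).
    print("extracting increment for", context["ds"])

with DAG(
    dag_id="orders_incremental",
    start_date=datetime(2025, 1, 1),
    schedule="@hourly",
    catchup=False,
    default_args={"retries": 3, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_increment)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run --select orders")
    test = BashOperator(task_id="dbt_test", bash_command="dbt test --select orders")

    extract >> transform >> test
```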

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities:
- Develop and maintain data pipelines and ETL processes using Snowflake, Streams & Tasks, and Snowpipe.
- Leverage Snowpark to build scalable data transformations in Python.
- Implement Secure Data Sharing, Row-Level Security, and Dynamic Data Masking for governed data access.
- Create and manage materialized views, automatic clustering, and search optimization for performance tuning.
- Collaborate with data scientists, analysts, and DevOps teams to deliver end-to-end data solutions.
- Monitor query performance, troubleshoot issues, and recommend optimizations using Query Profile and Resource Monitors.

Required Technical Skills:
- Strong expertise in Snowflake SQL and data modeling (star/snowflake schema).
- Hands-on with Snowpark, Streams/Tasks, and Secure Data Sharing.
- Proficiency in Python or Java for data processing with Snowpark.
- Experience with cloud platforms: AWS, Azure, or GCP (Snowflake-hosted environments).
- Familiarity with CI/CD, Git, and orchestration tools (e.g., Airflow, dbt).
- Working knowledge of data governance, data security, and compliance best practices.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Snowflake certification is a strong plus (e.g., SnowPro Core or Advanced).
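For context on the Snowpark responsibilities above, a minimal sketch of a Python transformation that runs inside Snowflake; connection parameters and table names are hypothetical.

```python
# Minimal Snowpark sketch: aggregate raw orders into a daily revenue table.
# Connection parameters and table names are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

session = Session.builder.configs({
    "account": "my_account",   # hypothetical
    "user": "etl_user",
    "password": "***",
    "warehouse": "TRANSFORM_WH",
    "database": "ANALYTICS",
    "schema": "RAW",
}).create()

daily_revenue = (
    session.table("RAW.ORDERS")
    .filter(col("STATUS") == "COMPLETE")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("AMOUNT")).alias("REVENUE"))
)

# Runs entirely inside Snowflake; only the final rows are materialized.
daily_revenue.write.mode("overwrite").save_as_table("MARTS.DAILY_REVENUE")
session.close()
```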

Posted 22 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


- 5 to 7 years of experience; full-time, work-from-office (WFO)
- Hands-on development experience in ETL using ODI 11g/12c, with Oracle SQL and PL/SQL programming experience, OR hands-on development experience in ETL using IICS
- Proficient in data migration techniques and data integration
- Experience with Data Warehouses and/or Data Marts

Qualifications: B.E. or any qualification

Essential Skills:
- Hands-on development experience in ETL using ODI 11g/12c
- Oracle SQL and PL/SQL programming experience
- Proficiency in warehousing architecture techniques
- Experience with Data Warehouses and/or Data Marts
- Good communication skills; self-sufficient in collaborating with project teams

Good to Have:
- Experience in database modeling (Enterprise Data Warehouse)
- Hands-on exposure to other ETL tools such as Informatica, and to MySQL or SQL Server

Posted 22 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities:
- Develop and maintain data pipelines and ETL processes using Snowflake, Streams & Tasks, and Snowpipe.
- Leverage Snowpark to build scalable data transformations in Python.
- Implement Secure Data Sharing, Row-Level Security, and Dynamic Data Masking for governed data access.
- Create and manage materialized views, automatic clustering, and search optimization for performance tuning.
- Collaborate with data scientists, analysts, and DevOps teams to deliver end-to-end data solutions.
- Monitor query performance, troubleshoot issues, and recommend optimizations using Query Profile and Resource Monitors.

Required Technical Skills:
- Strong expertise in Snowflake SQL and data modeling (star/snowflake schema).
- Hands-on with Snowpark, Streams/Tasks, and Secure Data Sharing.
- Proficiency in Python or Java for data processing with Snowpark.
- Experience with cloud platforms: AWS, Azure, or GCP (Snowflake-hosted environments).
- Familiarity with CI/CD, Git, and orchestration tools (e.g., Airflow, dbt).
- Working knowledge of data governance, data security, and compliance best practices.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Snowflake certification is a strong plus (e.g., SnowPro Core or Advanced).

Posted 22 hours ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Required Skills and Experience:
- 8+ years in IT operations, scheduling, and workflow automation using Control-M.
- Strong experience integrating Control-M with AWS cloud services.
- Hands-on experience working with enterprise ETL tools like Ab Initio or Informatica.
- Experience supporting data migration and orchestration involving modern cloud data platforms like Snowflake.
- Proficiency in Python scripting for automation and custom tooling around Control-M.
- Familiarity with real-time data streaming platforms such as Kafka or Kinesis.
- Solid understanding of job scheduling concepts, batch processing, and event-driven automation.
- Experience with CI/CD pipelines, Git, and automation of deployment workflows.
- Strong troubleshooting, root cause analysis, and incident resolution skills.

Preferred Qualifications:
- Bachelor's degree in Computer Science, IT, or a related field.
- Experience managing large-scale Control-M environments in enterprise settings.
- Knowledge of cloud data architecture and modern data engineering practices.
- Familiarity with Snowflake features and cloud data warehousing concepts.
- Certification in Control-M Administration or related scheduling tools is a plus.
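As a rough illustration of "Python scripting for automation and custom tooling around Control-M", here is a sketch against Control-M's REST Automation API; the host, credentials, folder name and exact endpoint payloads are assumptions to verify against your Control-M version.

```python
# Minimal sketch of custom tooling around Control-M via its Automation API
# (REST). Host, credentials, folder and server names are hypothetical, and
# the endpoint paths/payloads are illustrative -- verify against your version.
import requests

BASE = "https://ctm.example.com:8443/automation-api"  # hypothetical host

# Log in and obtain a session token.
login = requests.post(f"{BASE}/session/login",
                      json={"username": "ops_user", "password": "***"})
login.raise_for_status()
token = login.json()["token"]
headers = {"Authorization": f"Bearer {token}"}

# Order (trigger) a job folder on demand, e.g. after an upstream file lands.
run = requests.post(f"{BASE}/run/order",
                    json={"ctm": "ctm-server", "folder": "ETL_DAILY"},
                    headers=headers)
run.raise_for_status()
print("run id:", run.json().get("runId"))
```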

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are seeking a detail-oriented Data Test Engineer to join our data migration and cloud modernization team. The ideal candidate will have hands-on experience testing complex ETL pipelines, data migration workflows, and cloud data platforms like Snowflake, with exposure to legacy ETL tools such as Ab Initio or Informatica. Experience in automating data validation, performance testing, and supporting real-time ingestion using Kafka or similar technologies is essential.

Key Responsibilities:
- Design, develop, and execute test plans for data migration projects moving data from legacy systems to Snowflake.
- Validate data pipelines developed using ETL tools like Ab Initio and Informatica, ensuring data quality, accuracy, and integrity.
- Develop automated test scripts and frameworks using Python for data validation, reconciliation, and regression testing.
- Perform end-to-end data validation including schema validation, volume checks, transformation logic verification, and performance benchmarking.
- Test real-time data ingestion workflows integrating Kafka, Snowpipe, and Snowflake COPY commands.
- Collaborate closely with development, data engineering, and DevOps teams to identify defects, track issues, and ensure timely resolution.
- Participate in designing reusable test automation frameworks tailored for cloud data platforms.
- Ensure compliance with data governance, security, and regulatory requirements during testing.
- Document test cases and results, and provide clear reporting to stakeholders.
- Support CI/CD pipelines by integrating automated testing into the deployment workflow.

Required Skills and Experience:
- 5+ years in data testing or quality assurance with strong experience in data validation and ETL testing.
- Hands-on experience testing data migrations to Snowflake or other cloud data warehouses.
- Familiarity with legacy ETL tools like Ab Initio or Informatica and their testing methodologies.
- Proficiency in scripting languages such as Python for test automation and data validation.
- Knowledge of real-time data streaming platforms such as Kafka, Kinesis, or equivalents.
- Strong SQL skills for writing complex queries to validate data integrity and transformations.
- Experience with automated testing tools and frameworks for data quality checks.
- Understanding of cloud environments, particularly AWS services (S3, Lambda, Glue).
- Familiarity with CI/CD tools and practices to integrate automated testing.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience with performance and load testing of data pipelines.
- Knowledge of data governance and compliance frameworks.
- Exposure to BI tools such as Tableau or Power BI for validating data consumption layers.
- Certifications in data quality or cloud platforms (Snowflake, AWS) are a plus.
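To illustrate the reconciliation and validation work above, a minimal, tool-agnostic sketch that compares row counts and a column checksum between source and target over any two DB-API connections (e.g., cx_Oracle and snowflake-connector); table and column names are hypothetical.

```python
# Minimal migration-reconciliation sketch: compare row counts and a numeric
# checksum between source and target tables. Names are hypothetical.
def fetch_one(cursor, sql):
    cursor.execute(sql)
    return cursor.fetchone()[0]

def reconcile(src_cur, tgt_cur, table, amount_col):
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "amount_sum": f"SELECT SUM({amount_col}) FROM {table}",
    }
    failures = []
    for name, sql in checks.items():
        src_val, tgt_val = fetch_one(src_cur, sql), fetch_one(tgt_cur, sql)
        if src_val != tgt_val:
            failures.append(f"{name}: source={src_val} target={tgt_val}")
    return failures

# Usage (connections omitted):
#   assert not reconcile(src_cursor, tgt_cursor, "ORDERS", "AMOUNT")
```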

Posted 22 hours ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Tech Lead – ETL
Location: Offshore/India

Who are we looking for? A Tech Lead – ETL with 10+ years of experience, multi-skilled across the technologies mentioned below.

Tech Stack:
- Data Integration: Talend, EMR, Airflow, DataStage, Hue, Apache Tomcat, Python
- Visualization and Reporting: Cognos, Power BI, Tableau, QlikView, MicroStrategy, Sigma
- Monitoring/Scheduler Tools: AppDynamics, Data Power Operational Dashboard (DPOD), Microsoft System Center Orchestrator (IR Batch Scheduler), AutoSys, AWS CloudWatch

Technical Skills: Strong knowledge of, and working experience with, at least one technology from each category above.

Key Responsibilities:
- Extract data from heterogeneous sources such as databases, CRM systems, flat files, etc.
- Create and design ETL processes and pipelines.
- Ensure data quality, modeling and tuning.
- Apply strong troubleshooting and problem-solving skills to identify issues quickly, and provide innovative tooling to support this.
- Collaboration: work with developers, other administrators, and stakeholders to ensure smooth operations and integration.
- Continuous improvement: stay up to date with the latest industry trends and technologies to drive continuous improvement and innovation.
- DevOps and Agile practices: align middleware operations with DevOps and Agile principles and contribute to the automation of middleware-related tasks.

Qualification: Any Graduate.

Posted 22 hours ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Must-have skills: Power BI, Dundas BI, Tableau, Cognos
• Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
• 7+ years of experience as a Report Writer, BI Developer, or SQL Developer.
• Advanced proficiency in SQL (MySQL, PostgreSQL, or similar RDBMS).
• Experience developing and maintaining reports using BI tools such as Dundas BI, Power BI, Tableau, or Cognos.
• Strong knowledge of data modeling techniques and relational database design.
• Familiarity with ETL processes, data warehousing concepts, and performance tuning.
• Exposure to cloud platforms (Azure, AWS) is a plus.
• Experience working in Agile/Scrum environments.
• Strong analytical and problem-solving skills.
• Excellent communication skills and ability to work in a team environment.

Posted 22 hours ago

Apply

0 years

0 Lacs

India

Remote


Future-Able is looking for a Data Engineer, a full-time contract role, to work for Naked & Thriving: an organic, botanical skincare brand committed to creating high-performing, naturally derived products that are as kind to the planet as they are to your skin. Our mission is to empower individuals to embrace sustainable self-care while nurturing their natural beauty. Every product we craft reflects our dedication to sustainability, quality, and transparency, ensuring our customers feel confident with every choice they make. As we rapidly grow and expand into new categories, channels, and countries, customer satisfaction remains our top priority.

Job Summary: We are seeking a Data Engineer with expertise in Python, exposure to AI and machine learning, and a strong understanding of eCommerce analytics to design, develop, and optimize data pipelines. The ideal candidate will work on Google Cloud infrastructure, enabling advanced insights using Google Analytics (GA4).

What You Will Do:
● Develop and maintain scalable data pipelines to support analytics and AI-driven models.
● Work with Python (or an equivalent programming language) for data processing and transformation.
● Implement AI and machine learning techniques for predictive analytics and automation.
● Optimize eCommerce data insights using GA4 and Google Analytics to drive business decisions.
● Build cloud-based data infrastructure leveraging Google Cloud services like BigQuery, Pub/Sub, and Dataflow.
● Ensure data integrity and governance across structured and unstructured datasets.
● Collaborate with cross-functional teams including product managers, analysts, and marketing professionals.
● Monitor and troubleshoot data pipelines to ensure smooth operation and performance.

We are looking for:
● Proficiency in Python or a similar language (e.g., Scala).
● Experience with eCommerce analytics and tracking frameworks.
● Expertise in Google Analytics and GA4 for data-driven insights.
● Knowledge of Google Cloud Platform (GCP), including BigQuery, Cloud Functions, and Dataflow.
● Experience in designing, building, and optimizing data pipelines using ETL frameworks.
● Familiarity with data warehousing concepts and SQL-based query optimization.
● Strong problem-solving and communication skills in a fast-paced environment.

What will make you stand out:
● Experience with event-driven architecture for real-time data processing.
● Understanding of marketing analytics and attribution modeling.
● Previous work in a high-growth eCommerce environment.
● Exposure to AI and machine learning concepts and model deployment.

Benefits:
● USD salary.
● Fully remote work.
● USD 50 for health insurance payment.
● 30 days of paid time off per year.
● The possibility of being selected for annual bonuses based on business performance and personal achievements.
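As a small illustration of the GA4-on-BigQuery analytics this role involves, a sketch using the google-cloud-bigquery client; the project id and analytics dataset are hypothetical placeholders.

```python
# Minimal sketch: daily purchase counts from the GA4 BigQuery export tables.
# Project and dataset names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-ecommerce-project")  # hypothetical

sql = """
    SELECT event_date, COUNT(*) AS purchases
    FROM `my-ecommerce-project.analytics_123456.events_*`
    WHERE event_name = 'purchase'
      AND _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(sql).result():
    print(row.event_date, row.purchases)
```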

Posted 22 hours ago

Apply

3.0 years

0 Lacs

Calcutta

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate

Job Description & Summary: A career in our Microsoft Dynamics team will provide the opportunity to help our clients transform their technology landscape across Front, Back and Mid-Office functions leveraging Microsoft Dynamics. We focus on contributing to PwC's value proposition of "strategy led and technology enabled", by aligning our Consulting Solutions' industry focus with Microsoft technologies such as Dynamics 365, Azure, Power Platform and Power BI.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities:
· Experience as a Data Analyst with high proficiency
· Expertise in writing and optimizing SQL queries in SQL Server and/or Oracle
· Experience in data extraction, transformation and loading (ETL) using SSIS and ADF
· Experience in Power BI and/or Tableau for visualizing and analyzing data
· Knowledge of database normalization for optimum performance
· Excellence in MS Excel, with proficiency in VLOOKUPs, pivot tables and VBA macros
· Knowledge of data warehousing concepts
· Performance optimization and troubleshooting capabilities
· Good project management skills: client meetings, stakeholder engagement
· Familiarity with Agile methodology
· Strong knowledge of Azure DevOps Boards, Sprints, Queries, Pipelines (CI/CD), etc.

Mandatory skill sets: ADF, Power BI
Preferred skill sets: DevOps/CI/CD
Years of experience required: 3-7 years
Education qualification: B.Tech/B.E.

Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Power BI
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date:

Posted 22 hours ago

Apply

0 years

4 - 8 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Principal Consultant - Sr. Snowflake Data Engineer (Snowflake + Python + Cloud)! In this role, the Sr. Snowflake Data Engineer is responsible for providing technical direction and leading a group of one or more developers to address a goal.

Job Description:
- Experience in the IT industry
- Working experience building productionized data ingestion and processing pipelines in Snowflake
- Strong understanding of Snowflake architecture
- Fully well-versed with data warehousing concepts
- Expertise and excellent understanding of Snowflake features and integration of Snowflake with other data processing tools
- Able to create data pipelines for ETL/ELT
- dbt experience is good to have
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Able to create high-level and low-level design documents based on requirements
- Hands-on experience in configuring, troubleshooting, testing and managing data platforms, on premises or in the cloud
- Awareness of data visualization tools and methodologies
- Works independently on business problems and generates meaningful insights
- Some experience/knowledge of Snowpark, Streamlit or GenAI is good to have, but not mandatory
- Should have experience implementing Snowflake best practices
- Snowflake SnowPro Core certification will be an added advantage

Roles and Responsibilities:
- Requirement gathering, creating design documents, providing solutions to customers, working with offshore teams, etc.
- Writing SQL queries against Snowflake and developing scripts to extract, load, and transform data.
- Hands-on experience with Snowflake utilities such as SnowSQL, bulk copy, Snowpipe, Tasks, Streams, Time Travel, cloning, the optimizer, Metadata Manager, data sharing, stored procedures, UDFs, and Snowsight.
- Experience with the Snowflake cloud data warehouse and AWS S3 buckets or Azure Blob Storage containers for integrating data from multiple source systems.
- Some experience with AWS services (S3, Glue, Lambda) or Azure services (Blob Storage, ADLS Gen2, ADF).
- Good experience in Python/PySpark integration with Snowflake and cloud (AWS/Azure), with the ability to leverage cloud services for data processing and storage.
- Proficiency in the Python programming language, including knowledge of data types, variables, functions, loops, conditionals, and other Python-specific concepts.
- Knowledge of ETL (Extract, Transform, Load) processes and tools, and ability to design and develop efficient ETL jobs using Python and PySpark.
- Some experience with Snowflake RBAC and data security.
- Good experience implementing CDC or SCD Type 2 (see the sketch after this posting).
- In-depth understanding of data warehouses, ETL concepts and data modelling.
- Experience in requirement gathering, analysis, design, development, and deployment.
- Experience building data ingestion pipelines.
- Optimize and tune data pipelines for performance and scalability.
- Able to communicate with clients and lead a team.
- Proficiency working with Airflow or other workflow management tools for scheduling and managing ETL jobs.
- Good to have: experience deploying with CI/CD tools and with repositories such as Azure Repos, GitHub, etc.

Qualifications we seek in you!
Minimum qualifications: B.E./Master's in Computer Science, Information Technology, or Computer Engineering, or any equivalent degree with good IT experience and relevant experience as a Senior Snowflake Data Engineer.
Skill matrix: Snowflake, Python/PySpark, AWS/Azure, ETL concepts, data modeling and data warehousing concepts.

Why join Genpact?
- Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation
- Make an impact: drive change for global enterprises and solve business challenges that matter
- Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Principal Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Bachelor's / Graduation / Equivalent
Job Posting: Jun 15, 2025, 11:17:34 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
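As context for the CDC/SCD Type 2 item above, a minimal sketch of the common two-step pattern (expire changed rows, then insert current versions) run from Python against Snowflake; the dimension table, staging table and tracked columns are hypothetical.

```python
# Minimal SCD Type 2 sketch: expire changed rows, then insert current
# versions from a staging table. Names are hypothetical; conn is an open
# snowflake.connector connection.
EXPIRE_CHANGED = """
    UPDATE DIM_CUSTOMER
    SET END_DATE = CURRENT_DATE, IS_CURRENT = FALSE
    FROM STG_CUSTOMER s
    WHERE DIM_CUSTOMER.CUSTOMER_ID = s.CUSTOMER_ID
      AND DIM_CUSTOMER.IS_CURRENT
      AND (DIM_CUSTOMER.EMAIL <> s.EMAIL OR DIM_CUSTOMER.CITY <> s.CITY)
"""

INSERT_VERSIONS = """
    INSERT INTO DIM_CUSTOMER (CUSTOMER_ID, EMAIL, CITY, START_DATE, END_DATE, IS_CURRENT)
    SELECT s.CUSTOMER_ID, s.EMAIL, s.CITY, CURRENT_DATE, NULL, TRUE
    FROM STG_CUSTOMER s
    LEFT JOIN DIM_CUSTOMER d
      ON d.CUSTOMER_ID = s.CUSTOMER_ID AND d.IS_CURRENT
    WHERE d.CUSTOMER_ID IS NULL  -- new customers, plus rows just expired above
"""

def apply_scd2(conn):
    cur = conn.cursor()
    cur.execute("BEGIN")
    cur.execute(EXPIRE_CHANGED)   # close out rows whose tracked columns changed
    cur.execute(INSERT_VERSIONS)  # open a new current version for each
    cur.execute("COMMIT")
    cur.close()
```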

Posted 22 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description: The Global Data Insight & Analytics organization is looking for a top-notch Software Engineer with machine learning knowledge and experience to join our team and drive the next generation of our AI/ML (Mach1ML) platform. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers and designers from distributed locations, and will release early and often. The team you will be working on is focused on building Mach1ML, an AI/ML enablement platform to democratize machine learning across the Ford enterprise (like OpenAI's GPT or Facebook's FBLearner) to deliver next-gen analytics innovation. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, based on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organizational skills combined with critical thinking, problem-solving and agile management tools to support team success.

Responsibilities:
- As a Software Engineer, you will develop features for the Mach1ML platform and support customers in model deployment using Mach1ML on GCP and on-prem. You will use Rally to manage your work, incorporate an understanding of product functionality and the customer perspective for model deployment, and work with cutting-edge technologies such as GCP, Kubernetes, Docker, Seldon, Tekton, Airflow and Rally.
- Work closely with the Tech Anchor, Product Manager and Product Owner to deliver machine learning use cases using the Ford Agile Framework.
- Work with data scientists and ML engineers to tackle challenging AI problems.
- Work specifically on the Deploy team to drive model deployment and AI/ML adoption with other internal and external systems.
- Help innovate by researching state-of-the-art deployment tools and sharing knowledge with the team.
- Lead by example in the use of paired programming for cross-training/upskilling, problem solving, and speed to delivery.
- Leverage the latest GCP, CI/CD and ML technologies.
- Critical thinking: influence the strategic direction of the company by finding opportunities in large, rich data sets and crafting and implementing data-driven strategies that fuel growth, including cost savings, revenue and profit.
- Modelling: assess and evaluate the impact of missing/unusable data; design and select features; develop and implement statistical/predictive models using advanced algorithms on diverse sources of data; and test and validate models for forecasting, natural language processing, pattern recognition, machine vision, supervised and unsupervised classification, decision trees, neural networks, etc.
- Analytics: leverage rigorous analytical and statistical techniques to identify trends and relationships between different components of data, draw appropriate conclusions, and translate analytical findings and recommendations into business strategies or engineering decisions with statistical confidence.
- Data engineering: experience crafting ETL processes to source and link data in preparation for model/algorithm development, including domain expertise of data sets in the environment, third-party data evaluations, and data quality.
- Visualization: build visualizations to connect disparate data, find patterns and tell engaging stories, both scientific and geographic, using applications such as Seaborn, Qlik Sense, Power BI, Tableau or Looker Studio.

Qualifications

Minimum requirements:
- Bachelor's or master's degree in computer science engineering or a related field, or a combination of education and equivalent experience.
- 3+ years of experience in full-stack software development.
- 3+ years of experience with cloud technologies and services, preferably GCP.
- 3+ years of experience practicing statistical methods and their accurate application, e.g. ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multivariate analysis, neural networks, causal inference, Gaussian regression, etc.
- 3+ years of experience with Python, SQL and BigQuery.
- Experience with SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Google Cloud Build, Cloud Run, Vertex AI, Airflow, TensorFlow, etc.
- Experience training, building and deploying ML and DL models.
- Experience with Hugging Face, Chainlit, Streamlit and React.
- Ability to understand technical, functional, non-functional and security aspects of business requirements and deliver them end-to-end.
- Ability to adapt quickly to open-source products and tools to integrate with ML platforms.
- Building and deploying models (scikit-learn, DataRobot, TensorFlow, PyTorch, etc.).
- Developing and deploying in on-prem and cloud environments using Kubernetes, Tekton, OpenShift, Terraform and Vertex AI.

Preferred requirements:
- Master's degree in computer science engineering or a related field, or a combination of education and equivalent experience.
- Demonstrated successful application of analytical methods and machine learning techniques with measurable impact on product, design, business or strategy.
- Proficiency in programming languages such as Python with a strong emphasis on machine learning libraries, generative AI frameworks, and monitoring tools.
- Utilize tools and technologies such as TensorFlow, PyTorch, scikit-learn, and other machine learning libraries to build and deploy machine learning solutions on cloud platforms.
- Design and implement cloud infrastructure using technologies such as Kubernetes, Terraform, and Tekton to support scalable and reliable deployment of machine learning models, generative AI models, and applications.
- Integrate machine learning and generative AI models into production systems on cloud platforms such as Google Cloud Platform (GCP) and ensure scalability, performance, and proactive monitoring.
- Implement monitoring solutions to track the performance, health, and security of systems and applications, utilizing tools such as Prometheus and Grafana.
- Conduct code reviews and provide constructive feedback to team members on machine learning-related projects.
- Knowledge of and experience in agentic-workflow-based application development and DevOps.
- Stay up to date with the latest trends and advancements in machine learning and data science.
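To ground the "train, build and deploy ML models" requirement, a minimal scikit-learn sketch of the train-serialize-serve flow; the model choice and file name are illustrative, and the predict() entry point stands in for whatever a serving wrapper such as Seldon or Vertex AI would call.

```python
# Minimal train-then-deploy sketch: fit a scikit-learn model, serialize it,
# and expose a predict() entry point of the kind a serving wrapper would
# call. All names are illustrative.
import joblib
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train and serialize (the "build" step of the ML pipeline).
X, y = load_iris(return_X_y=True)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
joblib.dump(model, "model.joblib")

# Serving-side entry point (the "deploy" step): load once, predict per call.
_model = joblib.load("model.joblib")

def predict(features: list) -> list:
    return _model.predict(np.asarray(features)).tolist()

print(predict([[5.1, 3.5, 1.4, 0.2]]))  # -> [0]
```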

Posted 22 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Description: Are you passionate about solving business challenges at a global scale? Amazon Employee Services is looking for an experienced Business Analyst to join the Retail Business Services team and help unlock insights which take our business to the next level. The candidate will be excited about understanding and implementing new and repeatable processes to improve our employee global work authorization experiences. They will do this by partnering with key stakeholders, staying curious and comfortable digging deep into the business challenges to understand and identify insights that will enable us to define standards and improve our ability to globally scale this program. They will be comfortable delivering and presenting these recommended solutions by retrieving and integrating artifacts in a format that is immediately useful to improve the business decision-making process.

This role requires an individual with excellent analytical abilities as well as outstanding business acumen. The candidate knows and values our customers (internal and external) and will work back from the customer to create structured processes for global expansion of work authorization, and help integrate new countries and new acquisitions into the existing program. They are experts in partnering and earning trust with operations and business leaders to drive these key business decisions.

Responsibilities:
- Own the development and maintenance of new and existing artifacts focused on analysis of requirements, metrics, and reporting dashboards.
- Partner with operations/business teams to consult on, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs.
- Enable effective decision making by retrieving and aggregating data from multiple sources and compiling it into a digestible and actionable format.
- Prepare and deliver business requirements reviews to the senior management team regarding progress and roadblocks.
- Participate in strategic and tactical planning discussions.
- Design, develop and maintain scaled, automated, user-friendly systems, reports, dashboards, etc. that will support our business needs.
- Excellent writing skills, to create artifacts easily digestible by business and tech partners.

Key job responsibilities:
- Design and develop highly available dashboards and metrics using SQL and Excel/Tableau/QuickSight.
- Understand the requirements of stakeholders and map them to the data sources/data warehouse.
- Own the delivery and backup of periodic metrics and dashboards to the leadership team.
- Draw inferences and conclusions, create dashboards and visualizations of processed data, and identify trends and anomalies.
- Execute high-priority (i.e. cross-functional, high-impact) projects to improve operations performance with the help of Operations Analytics managers.
- Perform business analysis and data queries using appropriate tools.
- Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area.

Basic Qualifications:
- 3+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience.
- Experience defining requirements and using data and metrics to draw business insights.
- Experience with SQL or ETL.
- Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages.
- 1+ years of tax, finance or related analytical field experience.

Preferred Qualifications:
- Experience with Amazon Redshift and other AWS technologies.
- Experience creating complex SQL queries joining multiple datasets, and ETL/DW concepts.
- Experience in Scala and PySpark.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3009262

Posted 22 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


You Lead the Way. We've Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you'll learn and grow as we help you create a career journey that's unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you'll be recognized for your contributions, leadership, and impact: every colleague has the opportunity to share in the company's success. Together, we'll win as a team, striving to uphold our company values and powerful backing promise to provide the world's best customer experience every day. And we'll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong.

As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers' digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex.

How will you make an impact in this role?
- Build the next-gen data strategy, data virtualization, and data lakes and warehousing.
- Transform and improve performance of existing reporting and analytics use cases with more efficient, state-of-the-art data engineering solutions.
- Develop analytics to realize an advanced analytics vision and strategy in a scalable, iterative manner.
- Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering.
- Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes.
- Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness.
- Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions.
- Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions.

Minimum Qualifications:
- Bachelor's degree in computer science, computer science engineering, or a related field required; advanced degree preferred.
- 5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest next-gen BI and data strategy and BI tools.
- Proven experience in business intelligence, reporting on large datasets, data virtualization tools, big data, GCP, Java and microservices.
- Strong systems integration architecture skills and a high degree of technical expertise across a number of technologies, with a proven track record of turning new technologies into business solutions.
- Proficiency in at least one programming language (Python/Java), with a good understanding of data structures.
- GCP or other cloud knowledge is an added advantage.
- Good knowledge and understanding of Power BI, Tableau and Looker.
- Outstanding influence and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication.
- Experience managing in a fast-paced, complex, and dynamic global environment.

Preferred Qualifications:
- Proven experience in business intelligence and reporting on large datasets with Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, and other ETL tools such as Talend.
- Strong grounding in data structures and reasoning.

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 22 hours ago

Apply

6.0 years

0 Lacs

Andhra Pradesh

On-site

Job Description: We are hiring an Ab Initio ETL Lead with at least 6+ years of experience who can not only design and implement ETL solutions but also perform data analysis and production support activities. The ideal candidate will manage the end-to-end development lifecycle of ETL processes while providing leadership on data analysis tasks to ensure accurate and actionable insights. Additionally, the role involves providing production support to maintain the stability and performance of critical data pipelines.

About Virtusa: Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 22 hours ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Reference #: 319010BR
Job Type: Full Time

Your role: As a Senior Azure Data Engineer, you will work on one of the largest and most complex CRM application builds. You will work closely with Product Management, Software Development, Data Engineering and other teams to develop scalable and innovative CRM solutions. You will be accountable for the design and implementation of technical solutions within WMA and the timely delivery of projects following an agile/Scrum SDLC.

Your team: Are you an enthusiastic technology professional? Are you excited about seeking an enriching career, working for one of the finest financial institutions in the world? If so, you are the right person for this role. We are seeking technology and domain experts to join our Dynamics D365 CRM development team. We are responsible for WMA (Wealth Management Americas) client-facing technology applications. You'll be working in the WMA CRM crew, focusing on building applications used by financial advisors. Our team is dedicated to creating innovative solutions that drive our organization's success. We foster a collaborative and supportive environment, where you can grow and excel in your role.

Your expertise:
- Expert-level skills writing and optimizing complex SQL and Python (a must)
- Experience with complex data modelling, ETL design, and using large databases in a business environment
- Experience with large-scale on-prem to cloud data migration
- Experience with big-data technologies, data-lake/delta-lake solutions, and cloud-based data platforms
- Experience designing data pipelines for optimal performance, resiliency, and cost efficiency; fluent with industry-standard technologies like Spark and Kafka (see the sketch after this posting)
- Expert-level understanding of the Azure data ecosystem, including Azure Data Factory, Azure Synapse, Azure SQL, Azure Data Lake, Azure Fabric and Azure Cosmos DB (required)
- Experience designing data integration strategies leveraging batch and real-time data ingestion methods
- Implementing data security measures, including encryption, access controls, and auditing, to protect sensitive information
- Automating data pipelines and workflows to streamline data ingestion, processing, and distribution tasks

About Us: UBS is the world's largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire: We may request you to complete one or more assessments during the application process. Learn more.

Join us: At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We're dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That's why collaboration is at the heart of everything we do. Because together, we're more than ourselves. We're committed to disability inclusion, and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us.

Disclaimer / Policy Statements: UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.
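As a sketch of the Spark/Kafka pipeline pattern referenced in the expertise list above, a minimal PySpark structured-streaming job from Kafka into a Delta table; broker, topic and storage paths are hypothetical, and the cluster is assumed to have the Kafka and Delta Lake packages installed.

```python
# Minimal sketch: stream events from a Kafka topic into a Delta Lake table
# with checkpointing for resiliency. All names/paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("crm-events-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
    .option("subscribe", "crm-events")
    .option("startingOffsets", "latest")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"),
            col("timestamp"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/crm-events")  # enables restart
    .outputMode("append")
    .start("/mnt/delta/crm_events")
)
query.awaitTermination()
```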

Posted 22 hours ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Description
Are you passionate about solving business challenges at a global scale? Amazon Employee Services is looking for an experienced Business Analyst to join the Retail Business Services team and help unlock insights that take our business to the next level. The candidate will be excited about understanding and implementing new and repeatable processes to improve our employees' global work authorization experience. They will do this by partnering with key stakeholders, staying curious, and digging deep into business challenges to identify insights that enable us to define standards for globally scaling this program. They will be comfortable delivering and presenting these recommended solutions by retrieving and integrating artifacts in a format that is immediately useful to improve the business decision-making process. This role requires an individual with excellent analytical abilities as well as outstanding business acumen. The candidate knows and values our customers (internal and external) and will work back from the customer to create structured processes for global expansion of work authorization, and help integrate new countries and new acquisitions into the existing program. They are experts in partnering and earning trust with operations and business leaders to drive these key business decisions.

Responsibilities
Own the development and maintenance of new and existing artifacts focused on analysis of requirements, metrics, and reporting dashboards.
Partner with operations/business teams to consult, develop and implement KPIs, automated reporting/process solutions, and process improvements to meet business needs.
Enable effective decision making by retrieving and aggregating data from multiple sources and compiling it into a digestible and actionable format.
Prepare and deliver business requirements reviews to the senior management team regarding progress and roadblocks.
Participate in strategic and tactical planning discussions.
Design, develop and maintain scaled, automated, user-friendly systems, reports, dashboards, etc. that will support our business needs.
Excellent writing skills, to create artifacts easily digestible by business and tech partners.

Key job responsibilities
Design and develop highly available dashboards and metrics using SQL and Excel/Tableau/QuickSight
Understand the requirements of stakeholders and map them to the data sources/data warehouse
Own the delivery and backup of periodic metrics and dashboards to the leadership team
Draw inferences and conclusions, create dashboards and visualizations of processed data, and identify trends and anomalies
Execute high-priority (i.e. cross-functional, high-impact) projects to improve operations performance with the help of Operations Analytics managers
Perform business analysis and data queries using appropriate tools
Work closely with internal stakeholders such as business teams, engineering teams, and partner teams and align them with respect to your focus area

Basic Qualifications
3+ years of Excel or Tableau (data manipulation, macros, charts and pivot tables) experience
Experience defining requirements and using data and metrics to draw business insights
Experience with SQL or ETL
Knowledge of data visualization tools such as QuickSight, Tableau, Power BI or other BI packages
1+ years of tax, finance or a related analytical field experience

Preferred Qualifications
Experience in Amazon Redshift and other AWS technologies
Experience creating complex SQL queries joining multiple datasets, and with ETL and DW concepts
Experience in Scala and PySpark

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3009262
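To make the "complex SQL queries joining multiple datasets" qualification concrete, here is a small, runnable sketch of a dashboard-style KPI query. All table and column names are invented, and sqlite3 stands in for the actual warehouse (e.g. Redshift), so nothing here reflects Amazon's real schema.

```python
# Sketch: the kind of multi-dataset SQL a BA in this role might write --
# joining a case table to an approvals table to compute work-authorization
# KPIs per country. Tables, columns, and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE wa_cases (case_id INT, country TEXT, opened_date TEXT);
CREATE TABLE wa_approvals (case_id INT, approved_date TEXT);
INSERT INTO wa_cases VALUES (1,'IN','2024-01-02'),(2,'IN','2024-01-05'),(3,'US','2024-01-03');
INSERT INTO wa_approvals VALUES (1,'2024-01-06'),(3,'2024-01-04');
""")

# KPI: approval rate and average days-to-approve per country
kpi_sql = """
SELECT c.country,
       COUNT(a.case_id) * 1.0 / COUNT(c.case_id)                  AS approval_rate,
       AVG(julianday(a.approved_date) - julianday(c.opened_date)) AS avg_days_to_approve
FROM wa_cases c
LEFT JOIN wa_approvals a ON a.case_id = c.case_id
GROUP BY c.country
ORDER BY approval_rate DESC;
"""
for row in conn.execute(kpi_sql):
    print(row)
```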

Posted 22 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Engineer
Location: Within 2 hours driving distance of Crown Equipment offices
Employment Type: Full-time
Salary: Approx. $100,000 (commensurate with experience)

About the Role:
Crown Equipment is seeking an experienced Data Engineer to join our data and analytics team. The ideal candidate will have 5–8 years of experience in data engineering and will play a key role in maintaining and enhancing our data infrastructure. This is a hybrid role requiring in-office presence twice per quarter for collaborative team meetings.

Key Responsibilities:
Develop and maintain robust ETL/ELT pipelines using SSIS
Design and optimize SQL queries and stored procedures in SQL Server
Perform data modeling and schema design to support business intelligence and reporting needs
Leverage MySQL and other database technologies for cross-platform integration
Work with cloud tools such as Azure or AWS for scalable data processing and storage
Collaborate with analysts, business users, and other engineers to gather requirements and deliver high-quality data solutions
Ensure data quality, reliability, and performance across data systems

Required Qualifications:
Minimum of 5 years of hands-on data engineering experience (8 years preferred)
Strong command of SSIS and SQL Server
Proficient in writing complex SQL queries
Familiarity with MySQL
Experience with Azure, AWS, or other cloud data services is a plus
Strong problem-solving skills and attention to detail
Excellent communication and collaboration skills

Additional Information:
Must reside within 2 hours driving distance of a Crown Equipment office
Must attend in-person team meetings twice per quarter
No visa sponsorship is available for this role.
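A brief, hedged illustration of the SQL Server side of this role: kicking off a load stored procedure and verifying the result from Python via pyodbc. The server, database, procedure, and table names are assumptions for the sketch; in practice the orchestration would typically live in an SSIS package or SQL Agent job.

```python
# Sketch: invoking a hypothetical SQL Server load procedure from Python.
# Server, database, procedure, and table names are illustrative
# assumptions; requires the pyodbc package and a SQL Server ODBC driver.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.com;"   # assumed server
    "DATABASE=WarehouseDB;"           # assumed database
    "Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Run an incremental load for one business date via the ODBC call syntax
cursor.execute("{CALL dbo.usp_LoadDailySales (?)}", "2024-01-31")
conn.commit()

# Verify how many rows the load produced for that date
cursor.execute(
    "SELECT COUNT(*) FROM dbo.FactDailySales WHERE sales_date = ?", "2024-01-31"
)
print("rows loaded:", cursor.fetchone()[0])
conn.close()
```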

Posted 22 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Role: ETL Test Analyst with Databricks
Job Location: Bangalore, Bhubaneswar
Experience Requirement: 4+ years
Technical Skill Set: ETL Testing, Strong SQL, Databricks

Must have:
Good experience in developing SQL queries for testing complex ETL transformations
Proficiency in ETL Testing, Data Validation, and Data Warehouse Testing
Proficiency in writing and executing SQL queries for data validation and testing
Good knowledge of Data Warehousing QA concepts
Hands-on experience with Databricks
Thorough understanding of Databricks-based ETL workflows, including data ingestion, transformation, and validation
Knowledge of Test Strategy, Test Plan, Test Case creation, STLC, and the Bug Life Cycle
Experience with Agile & QA tools

Secondary skills (good to have):
Knowledge of PySpark
Basic understanding of DAX code in Power BI
Proficient knowledge of Azure DevOps, SDLC, STLC, DevOps and CI/CD processes
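To make the first must-have concrete — SQL queries that test complex ETL transformations — here is a minimal validation sketch in PySpark, reconciling a row count and a column aggregate between an assumed staging table and its transformed target. All table and column names are invented for illustration.

```python
# Sketch: source-vs-target reconciliation checks of the kind this role
# describes, run on a Databricks-style Spark environment. Table and
# column names are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-validation").getOrCreate()

checks = {
    # Row counts should match after a 1:1 transformation
    "row_count": (
        "SELECT COUNT(*) AS v FROM staging.orders",
        "SELECT COUNT(*) AS v FROM curated.orders",
    ),
    # A financial aggregate should survive the transformation unchanged
    "amount_sum": (
        "SELECT ROUND(SUM(amount), 2) AS v FROM staging.orders",
        "SELECT ROUND(SUM(order_amount), 2) AS v FROM curated.orders",
    ),
}

for name, (src_sql, tgt_sql) in checks.items():
    src = spark.sql(src_sql).collect()[0]["v"]
    tgt = spark.sql(tgt_sql).collect()[0]["v"]
    status = "PASS" if src == tgt else "FAIL"
    print(f"{name}: source={src} target={tgt} -> {status}")
```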

Posted 22 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Overview: C5i
C5i is a pure-play AI & Analytics provider that combines the power of human perspective with AI technology to deliver trustworthy intelligence. The company drives value through a comprehensive solution set, integrating multifunctional teams that have technical and business domain expertise with a robust suite of products, solutions, and accelerators tailored for various horizontal and industry-specific use cases. At the core, C5i’s focus is to deliver business impact at speed and scale by driving adoption of AI-assisted decision-making. C5i caters to some of the world’s largest enterprises, including many Fortune 500 companies. The company’s clients span Technology, Media, and Telecom (TMT), Pharma & Life Sciences, CPG, Retail, Banking, and other sectors. C5i has been recognized by leading industry analysts like Gartner and Forrester for its Analytics and AI capabilities and proprietary AI-based platforms.

Global offices: United States | Canada | United Kingdom | United Arab Emirates | India

Job Summary
We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.
• Independently complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, Spark, Databricks Delta Lakehouse or other cloud data warehousing technologies.
• Govern data design/modelling – documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned.
• Develop a deep understanding of business domains like Customer, Sales, Finance, Supplier, and the enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
• Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
• Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; the SAP data model.
• Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
• Partner with the data stewards team for data discovery and action by business customers and stakeholders.
• Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
• Assist with data planning, sourcing, collection, profiling, and transformation.
• Support data lineage and mapping of source system data to canonical data stores.
• Create Source to Target Mappings (STTM) for ETL and BI developers.

Skills needed:
• Expertise in data modelling tools (ER/Studio, Erwin), IDM/ARDM models, and CPG/Manufacturing/Sales/Finance/Supplier/Customer domains
• Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake
• Experience with version control systems like GitHub and deployment & CI tools
• Experience with metadata management, data lineage, and data glossaries is a plus
• Working knowledge of agile development, including DevOps and DataOps concepts
• Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen Retail

C5i is proud to be an equal opportunity employer. We are committed to equal employment opportunity regardless of race, color, religion, sex, sexual orientation, age, marital status, disability, gender identity, etc. If you have a disability or special need that requires accommodation, please keep us informed at the hiring stages so we can factor in the necessary accommodations.
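As a rough illustration of the STTM responsibility above, a code-first Source to Target Mapping can be kept as a small, versionable artifact rather than a spreadsheet. The sketch below assumes that approach; every table, column, and transformation rule in it is invented, not taken from the posting.

```python
# Sketch: a code-first Source-to-Target Mapping (STTM) of the kind data
# modelers hand to ETL/BI developers. All names and rules are invented
# examples for illustration only.
import csv
from dataclasses import dataclass, asdict

@dataclass
class Mapping:
    source_table: str
    source_column: str
    target_table: str
    target_column: str
    transformation: str

sttm = [
    Mapping("crm.accounts", "acct_id", "dim_customer", "customer_key", "surrogate key lookup"),
    Mapping("crm.accounts", "acct_nm", "dim_customer", "customer_name", "TRIM + UPPER"),
    Mapping("erp.orders", "ord_amt", "fact_sales", "net_amount", "CAST(DECIMAL(18,2))"),
]

# Export as a shared, versionable artifact for ETL/BI developers
with open("sttm_customer_sales.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(sttm[0]).keys()))
    writer.writeheader()
    writer.writerows(asdict(m) for m in sttm)
```

Keeping the mapping in version control alongside the pipeline code supports the lineage and code-first modelling practices the role emphasizes.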

Posted 22 hours ago

Apply

8.0 years

0 Lacs

Greater Hyderabad Area

On-site


Data Test Engineer (Snowflake, Python, DB Testing, SQL)
Job Location: Hyderabad, Pune or Gurugram
Experience Level: Mid-Senior (8+ Years)

About Kellton:
We are a global IT services and digital product design and development company with subsidiaries that serve startup, mid-market, and enterprise clients across diverse industries, including Finance, Healthcare, Manufacturing, Retail, Government, and Nonprofits. At Kellton, we believe that our people are our greatest asset. We are committed to fostering a culture of collaboration, innovation, and continuous learning. Our core values include integrity, customer focus, teamwork, and excellence. To learn more about our organization, please visit us at www.kellton.com

Are you craving a dynamic and autonomous work environment? If so, this opportunity may be just what you're looking for. We value your critical thinking skills and encourage your input and creative ideas. To boost your productivity, we provide a comprehensive suite of IT tools and practices backed by an experienced team.

About the Role:
We are seeking a highly skilled DBT Engineer to take charge of migrating ETL pipelines from SSIS (SQL Server Integration Services) to DBT (Data Build Tool) and developing new DBT pipelines. The candidate should have strong experience in building data pipelines, working with SQL Server, and leveraging DBT for data transformation. This role requires deep knowledge of modern data platforms and the ability to ensure a smooth transition from legacy systems to modern architectures.

What you will do:
Develop and execute unit tests for migrated SQL Server stored procedures rewritten in Snowflake SQL to ensure functionality and performance are consistent with the original logic.
Validate data accuracy, consistency, and performance in the new Snowflake environment.
Create and execute unit tests and integration tests for DBT pipelines, ensuring that transformations are accurate, reliable, and performant.
Implement DBT test suites to verify the correctness of data models, transformation logic, and data quality.
Validate the output of scripts against expected results and ensure efficient processing in the AWS/Snowflake cloud infrastructure.
Perform end-to-end testing for the entire data flow, from DBT pipelines and Snowflake transformations to report generation via Python/R scripts.

Required Skills and Qualifications:
Proven experience in testing data pipelines and ETL processes using DBT or similar tools.
Experience testing SQL transformations, particularly migrations from SQL Server to Snowflake SQL.
Hands-on experience with testing Python and R scripts, particularly in data processing or analytical environments.
Experience working on data migration projects, particularly to cloud environments like Snowflake.
Strong knowledge of SQL for writing test cases and validating query results.
Experience with DBT testing frameworks, including setting up and running DBT tests.
Proficiency in Python and R for testing and debugging data transformation scripts.
Familiarity with version control systems (e.g., Git) for managing test scripts and test case versioning.
Knowledge of CI/CD pipelines for integrating automated testing into the development process.
Familiarity with cloud platforms (AWS) and cloud-native databases like Snowflake.
Experience in performance testing and load testing for SQL queries, pipelines, and scripts running on large datasets.

What we offer you:
· Existing clients in multiple domains to work with
· A strong and efficient team committed to quality output
· Opportunities to enhance your knowledge and gain industry domain expertise by working in varied roles
· A team of experienced, fun, and collaborative colleagues
· Hybrid work arrangement for flexibility and work-life balance (if the client/project allows)
· Competitive base salary and job satisfaction

Join our team and become part of an exciting company where your expertise and ideas are valued, and where you can make a significant impact in the IT industry. Apply today! Interested applicants, please submit your detailed resume stating your current and expected compensation and notice period to srahaman@kellton.com
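For applicants wondering what "setting up and running DBT tests" looks like from a Python harness, here is a minimal sketch using dbt-core's programmatic runner (available from dbt-core 1.5 onward). The model selector is an invented example, and a configured dbt project and profile are assumed.

```python
# Sketch: running a dbt test suite programmatically with dbt-core's
# runner API. The "stg_orders" selector is a hypothetical model name;
# assumes a working dbt project and profiles.yml.
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Run only the tests attached to the assumed migrated model
res: dbtRunnerResult = runner.invoke(["test", "--select", "stg_orders"])

if res.success:
    print("all dbt tests passed")
else:
    # Inspect individual node results to see which tests failed
    for r in res.result:
        print(f"{r.node.name}: {r.status}")
```

Wrapping the invocation this way makes it straightforward to plug dbt tests into the CI/CD pipelines the qualifications call for.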

Posted 22 hours ago

Apply

2.5 years

0 Lacs

Mumbai, Maharashtra, India

On-site


SQL-ETL Developer

Preferred candidate profile:
2.5+ years of hands-on experience in SQL-ETL development, Informatica, SQL, and stored procedures.

Other Details:
Experience: 2.5+ years
Location: Mumbai
Desirable: Experience in SQL-ETL development, Informatica, SQL, and stored procedures
Work Mode: Work from Office
Immediate joining (30 days' notice period preferred)

Company Description
Infocus Technologies Pvt Ltd is a Kolkata-based consulting company that provides SAP, ERP & Cloud consulting services. The company is ISO 9001:2015 DNV certified, CMMI Level 3 certified, and a Gold Partner of SAP in Eastern India. Infocus helps customers migrate and host SAP infrastructure on the AWS cloud. Its services in the ERP domain include implementation, version upgrades, and Enterprise Application Integration (EAI) solutions.

Posted 22 hours ago

Apply