13457 ETL Jobs - Page 42

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

4.0 years

0 Lacs

India

Remote

We're Hiring: Integration Consultant (Kinaxis Maestro)
Work from Anywhere | Remote Opportunity
Full-Time | 2–4 Years Experience

Are you a Kinaxis Maestro pro with a passion for building seamless integrations that drive supply chain transformation? Join Simbus Tech, a trusted Kinaxis partner, and work with global clients to enable smarter, faster supply chain decisions.

What You'll Do
- Design and implement robust integration solutions for Kinaxis RapidResponse using Maestro.
- Develop and maintain data pipelines using APIs, ETL tools, and middleware.
- Collaborate with functional and technical teams to align integration strategies with business needs.
- Troubleshoot integration issues and ensure data consistency across systems.
- Contribute to continuous improvement and documentation of integration best practices.

What We're Looking For
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2–4 years of experience as a Kinaxis Integration Consultant or in a similar role.
- Strong hands-on expertise in Kinaxis Maestro and integration frameworks.
- Proficiency in APIs, ETL processes, and tools such as Talend, MuleSoft, or Snowflake.
- Relevant certifications in Kinaxis Maestro or integration technologies (preferred).
- Strong problem-solving skills, communication, and a collaborative mindset.

Why Join Simbus Tech?
- 100% Remote | Work from Anywhere
- Exposure to cutting-edge supply chain technologies
- Collaboration with top-tier global clients
- Growth-driven, people-first culture

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Lead, Data Engineer

Who is Mastercard?
Mastercard is a global technology company in the payments industry. Our mission is to connect and power an inclusive, digital economy that benefits everyone, everywhere by making transactions safe, simple, smart, and accessible. Using secure data and networks, partnerships and passion, our innovations and solutions help individuals, financial institutions, governments, and businesses realize their greatest potential. Our decency quotient, or DQ, drives our culture and everything we do inside and outside of our company. With connections across more than 210 countries and territories, we are building a sustainable world that unlocks priceless possibilities for all.

Overview
The Mastercard Services Technology team is looking for a Lead in Data Engineering to drive our mission to unlock the potential of data assets by consistently innovating, eliminating friction in how we manage and store big data assets, making data accessible, and enforcing standards and principles in the Big Data space, both on public cloud and on-premises setups. We are looking for a hands-on, passionate Data Engineer who is not only technically strong in PySpark, cloud platforms, and building modern data architectures, but also deeply committed to learning, growing, and lifting others. The person will play a key role in designing and building scalable data solutions, shaping our engineering culture, and mentoring team members. This is a role for builders and collaborators: engineers who love clean data pipelines, cloud-native design, and helping teammates succeed.

Role
- Design and build scalable, cloud-native data platforms using PySpark, Python, and modern data engineering practices.
- Mentor and guide other engineers, sharing knowledge, reviewing code, and fostering a culture of curiosity, growth, and continuous improvement.
- Create robust, maintainable ETL/ELT pipelines that integrate with diverse systems and serve business-critical use cases (an illustrative PySpark sketch follows this listing).
- Lead by example: write high-quality, testable code and participate in architecture and design discussions with a long-term view in mind.
- Decompose complex problems into modular, efficient, and scalable components that align with platform and product goals.
- Champion best practices in data engineering, including testing, version control, documentation, and performance tuning.
- Drive collaboration across teams, working closely with product managers, data scientists, and other engineers to deliver high-impact solutions.
- Support data governance and quality efforts, ensuring data lineage, cataloging, and access management are built into the platform.
- Continuously learn and apply new technologies, frameworks, and tools to improve team productivity and platform reliability.
- Own and optimize cloud infrastructure components related to data engineering workflows, storage, processing, and orchestration.
- Participate in architectural discussions, iteration planning, and feature sizing meetings.
- Adhere to Agile processes and participate actively in agile ceremonies.
- Apply stakeholder management skills.

All About You
- 5+ years of hands-on experience in data engineering with strong PySpark and Python skills.
- Solid experience designing and implementing data models, pipelines, and batch/stream processing systems.
- Proven ability to work with cloud platforms (AWS, Azure, or GCP), especially data-related services like S3, Glue, Data Factory, Databricks, etc.
- Strong foundation in data modeling, database design, and performance optimization.
- Understanding of modern data architectures (e.g., lakehouse, medallion) and data lifecycle management.
- Comfortable with CI/CD practices, version control (e.g., Git), and automated testing.
- Demonstrated ability to mentor and uplift junior engineers; strong communication and collaboration skills.
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent hands-on experience.
- Comfortable working in Agile/Scrum development environments.
- Curious, adaptable, and driven by problem-solving and continuous improvement.

Good To Have
- Experience integrating heterogeneous systems and building resilient data pipelines across cloud environments.
- Familiarity with orchestration tools (e.g., Airflow, dbt, Step Functions).
- Exposure to data governance tools and practices (e.g., Lake Formation, Purview, or Atlan).
- Experience with containerization and infrastructure automation (e.g., Docker, Terraform).
- Master's degree, relevant certifications (e.g., AWS Certified Data Analytics, Azure Data Engineer), or demonstrable contributions to open-source or data engineering communities.
- Exposure to machine learning data pipelines or MLOps.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization. It is therefore expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-251380
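The role above centres on PySpark ETL/ELT pipelines over medallion-style (bronze/silver/gold) architectures. As an illustration only, not Mastercard's actual code, here is a minimal PySpark sketch of such a flow; the column names, storage paths, and session settings are all assumptions:

    # A minimal sketch of a medallion-style PySpark pipeline; paths and
    # schema are hypothetical and chosen only for illustration.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("transactions-etl").getOrCreate()

    # Bronze: ingest the raw CSV landing zone as-is (path is illustrative).
    raw = spark.read.option("header", True).csv("s3a://bucket/raw/transactions/")

    # Silver: cast types, drop duplicates, and filter obviously bad rows.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("txn_date", F.to_date("txn_date"))
           .dropDuplicates(["txn_id"])
           .filter(F.col("amount").isNotNull())
    )

    # Gold: a business-level daily aggregate, partitioned for downstream use.
    daily = clean.groupBy("txn_date", "merchant_id").agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("txn_count"),
    )
    daily.write.mode("overwrite").partitionBy("txn_date").parquet(
        "s3a://bucket/gold/daily_totals/"
    )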

Posted 4 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

Work location: Pune

You will work with
A multi-disciplinary squad crafting, developing, and operating Palantir Foundry based applications for the production and operations business. Our teams use Palantir Foundry and various data engineering technologies for data pipelines and data modelling, and build and operate business-critical, data-driven solutions. You will use AI/ML and LLMs to drive business efficiency.

Let me tell you about the role
As an enterprise engineer with experience of Palantir Foundry and data engineering technology, you will work with engineers who bring a diverse set of experiences in developing and maintaining applications across production and operations.

What you will deliver
- Build and maintain applications on the Palantir Foundry platform (a hedged sketch of a Foundry transform follows below).
- Develop and optimize data pipelines and workflows.
- Perform data integration, analysis, and visualization tasks; ensure data quality and integrity.
- Identify and solve issues within the Palantir Foundry environment.
- Collaborate with multi-functional teams to deliver data-driven solutions; provide technical support and training to team members.

What you will need to be successful (experience and qualifications)
Technical Skills
- Bachelor's degree in Computer Science, Engineering, or Computer Information Systems, with prior experience in software and platform engineering.
- 3+ years of hands-on Palantir Foundry experience, with an understanding of Ontology, Code Repositories, Pipeline Builder, Workshop, Quiver, and Contour.
- Experience with data integration and ETL processes.
- Knowledge of scripting and programming languages such as Python, Spark, Scala, and SQL.
- Awareness of software engineering practices and standard methodologies for the full SDLC, including coding standards, code reviews, source control management, continuous deployment, testing, and operations.
- Collaboration skills: able to engage and influence others to collect requirements, describe what you're doing, work through problems, and find productive solutions.
- Interpersonal skills for partnering with customers and senior leadership.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Additional Information
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. Even though the job is advertised as full time, please contact the hiring manager or the recruiter, as flexible working arrangements may be considered.
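For readers unfamiliar with Palantir Foundry, "build and maintain applications on Foundry" typically involves Python transforms in a Code Repository. The sketch below is a hedged illustration of that pattern, assuming Foundry's transforms API; the dataset paths and columns are invented, and this runs only inside a Foundry Code Repository, not as a standalone script:

    # A sketch of a Foundry Python transform, assuming the transforms API
    # available in Foundry Code Repositories; paths and columns are
    # hypothetical examples, not bp's actual datasets.
    from transforms.api import transform_df, Input, Output
    from pyspark.sql import functions as F

    @transform_df(
        Output("/Production/ops/clean_well_readings"),
        source=Input("/Production/ops/raw_well_readings"),
    )
    def clean_well_readings(source):
        # Standardise types and drop obviously bad or duplicated sensor rows.
        return (
            source.withColumn("reading", F.col("reading").cast("double"))
                  .filter(F.col("reading").isNotNull())
                  .dropDuplicates(["well_id", "timestamp"])
        )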
Travel Requirement
Up to 10% travel should be expected with this role.

Relocation Assistance: This role is eligible for relocation within country.

Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer:
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

Enterprise Technology Engineers in bp
bp is reinventing itself, and digital capability is at the core of this vision. As a Senior Enterprise Technology Engineer you are a digital expert bringing deep specialist expertise to bp. Enterprise Technology Engineers work on the strategic technology platforms we exploit from the market, or come with deep skills in the implementation and integration of market solutions into our overall technology landscape. You will bring a broad base of digital technical knowledge and a strong understanding of software delivery principles. You will be familiar with lifecycle methods, with Agile delivery and the DevOps approach at the core. You will be skilled in the application of approaches such as Site Reliability Engineering in the delivery and operation of the technologies you deliver, working as part of multidisciplinary squads.

You thrive in a culture of continuous improvement within teams, encouraging and empowering innovation and the delivery of changes that optimise operational efficiency and user experience. You are curious and improve your skills through continuous learning of new technologies, trends and methods, applying knowledge gained to improve bp standards and the capabilities of the engineering community. You coach others in the field to drive improved performance across our business. You embrace a culture of change and agility, evolving continuously, adapting to our changing world. You are an effective teammate, looking beyond your own area and organizational boundaries to consider the bigger picture and the perspective of others, while understanding cultural differences. You continually enhance your self-awareness and seek guidance from others on your impact and effectiveness. Well organized, you balance proactive and reactive approaches and multiple priorities to complete tasks on time. You apply judgment and common sense, using insight and good judgment to inform actions and respond to situations as they arise.

Key Accountabilities
- Technical lead for the invoice processing application called eBilling.
- Managing reliability of service and delivering to agreed SLAs.
- Collaborating with platform and security teams for patching and vulnerability management.
- The safety of our people and our customers is our highest priority; the role will advocate and lead in this and promote security and safety in everything that we do.
- Work as part of evolving multidisciplinary teams which may include Software Engineers, Enterprise Technology Engineers, Designers, SecOps, and Product Owners to deliver value through the application of specialist skills.
- Work with vendors and partners providing market solutions to optimize the usage and value which can be delivered from the appropriate technology platform.
- Ensure operational integrity of what you build, assuring operational compliance with architectural and security standards, as well as compliance and policy controls refined by Strategy.
- Mentor others and become a conduit to connect the broader organization.
- Define and document standard run books and operating procedures; create and maintain system information and architecture diagrams.

Education
A first degree from a recognized institute of higher learning, ideally computer science or engineering based.

Essential Experience and Job Requirements
- 8+ years of total experience, with good knowledge of the Order to Cash process (preferably in the aviation domain).
- Informatica ETL.
- MS SQL.
- Data integration patterns (preferably with XML invoice processing).
- Experience with leading teams.
- Demonstrable knowledge of modern service delivery methods, from Site Reliability Engineering to traditional ITIL, and an understanding of product-based delivery.
- Strong communication skills and a high 'EQ', with the ability to operate across complex business environments and collaborators up to senior executive level.

Desirable Criteria
- Project management experience delivering IT-led projects.
- Broad experience contributing and collaborating to assist design, plan, implement, maintain, and document services and solutions.
- Development experience in one or more object-oriented or applicable programming languages (e.g. Python, Go, Java, C/C++).

Skills That Set You Apart
- Passion for mentoring and coaching engineers in both technical and soft skills.
- A focus on delighting customers with outstanding user experiences and customer service.
- Comfort operating in an environment that is loosely coupled but tightly aligned toward a shared vision.

About bp
Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

Travel Requirement
Up to 10% travel should be expected with this role.

Relocation Assistance: This role is eligible for relocation within country.

Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms, Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSQL data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer:
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 4 days ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Beco
Beco (letsbeco.com) is a fast-growing Mumbai-based consumer-goods company on a mission to replace everyday single-use plastics with planet-friendly, bamboo- and plant-based alternatives. From reusable kitchen towels to biodegradable garbage bags, we make sustainable living convenient, affordable and mainstream. Our founding story began with a Mumbai beach clean-up that opened our eyes to the decades-long life of a single plastic wrapper, sparking our commitment to "Be Eco" every day. Our mission: "To craft, support and drive positive change with sustainable & eco-friendly alternatives—one Beco product at a time." Backed by marquee climate-focused VCs and now 50+ employees, we are scaling rapidly across India's top marketplaces, retail chains and D2C channels.

Why we're hiring
Sustainability at scale demands operational excellence. As volumes explode, we need data-driven, self-learning systems that eliminate manual grunt work, unlock efficiency and delight customers. You will be the first dedicated AI/ML Engineer at Beco, owning the end-to-end automation roadmap across Finance, Marketing, Operations, Supply Chain and Sales.

Responsibilities
- Partner with functional leaders to translate business pain-points into AI/ML solutions and automation opportunities.
- Own the complete lifecycle: data discovery, cleaning, feature engineering, model selection, training, evaluation, deployment and monitoring.
- Build robust data pipelines (SQL/BigQuery, Spark) and APIs to integrate models with ERP, CRM and marketing automation stacks.
- Stand up CI/CD and MLOps (Docker, Kubernetes, Airflow, MLflow, Vertex AI/SageMaker) for repeatable training and one-click releases.
- Establish data-quality, drift-detection and responsible-AI practices (bias, transparency, privacy).
- Mentor analysts and engineers; evangelise a culture of experimentation and "fail-fast" learning, core to Beco's GSD ("Get Sh#!t Done") values.

Must-have Qualifications
- 3+ years of hands-on experience delivering ML, data-science or intelligent-automation projects in production.
- Proficiency in Python (pandas, scikit-learn, PyTorch/TensorFlow) and SQL; solid grasp of statistics, experimentation and feature engineering.
- Experience building and scaling ETL/data pipelines on cloud (GCP, AWS or Azure).
- Familiarity with modern Gen-AI and NLP stacks (OpenAI, Hugging Face, RAG, vector databases).
- Track record of collaborating with cross-functional stakeholders and shipping iteratively in an agile environment.

Nice-to-haves
- Exposure to e-commerce or FMCG supply-chain data.
- Knowledge of finance workflows (reconciliation, AR/AP, FP&A) or RevOps tooling (HubSpot, Salesforce).
- Experience with vision models (Detectron2, YOLO) and edge deployment.
- Contributions to open-source ML projects or published papers/blogs.

What Success Looks Like After 1 Year
- 70% reduction in manual reporting hours across finance and ops.
- Forecast accuracy > 85% at SKU level, slashing stock-outs by 30% (a toy forecasting sketch follows this listing).
- AI chatbot resolves 60% of tickets end-to-end, with CSAT > 4.7/5.
- At least two new data products launched that directly boost topline or margin.

Life at Beco
- Purpose-driven team obsessed with measurable climate impact.
- An entrepreneurial, accountable, bold culture, where winning minds precede outside victories.
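As a concrete flavour of the SKU-level forecasting target above, here is a toy scikit-learn sketch on synthetic data; the features, model choice, and metric are illustrative assumptions, not Beco's actual pipeline:

    # A toy demand-forecast sketch with scikit-learn; the data is synthetic
    # and the features (week, price, promo flag) are assumptions.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_percentage_error

    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.integers(1, 53, n),       # week of year
        rng.uniform(49, 199, n),      # unit price
        rng.integers(0, 2, n),        # promotion flag
    ])
    # Synthetic weekly units sold, driven by price and promotions.
    y = 200 - 0.5 * X[:, 1] + 40 * X[:, 2] + rng.normal(0, 10, n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"holdout MAPE: {mape:.2%}")  # rough proxy for a forecast-accuracy target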

Posted 4 days ago

Apply

1.0 - 2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Us
Growing tech company seeking a talented Data Analyst to build and lead our data initiatives. This role offers significant growth potential, including future team leadership opportunities.

Key Responsibilities
- Analyze complex datasets and create actionable insights
- Build automated reporting systems and dashboards
- Identify trends and patterns to support business decisions
- Document processes and maintain data quality standards
- Collaborate with cross-functional teams to understand data needs

Required Skills
- 1-2 years of experience in data analysis
- Strong SQL proficiency
- Advanced Excel skills
- Statistical analysis expertise
- Clear communication and presentation abilities
- Experience with visualization tools (Tableau/Power BI)

Preferred Skills
- Python programming
- Machine learning basics
- ETL processes knowledge
- Business intelligence tools
- Data warehouse concepts

Personal Qualities
- Self-motivated problem solver
- Independent worker with team collaboration skills
- Proactive learner with technical curiosity
- Strong analytical mindset
- Leadership potential

Education
Bachelor's degree in Statistics, Mathematics, Computer Science, or a related field

Posted 4 days ago

Apply

4.0 - 5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The job functions include, but are not limited to, the following:
- Ability to conduct different types of testing (e.g., data validation, regression), including evaluating the testability of requirements.
- Proficient in ETL testing; takes ownership for consistently completing planned deployments on time and to scope.
- Review data models, data mappings, and architectural documentation.
- Define and track quality assurance metrics such as defects, defect counts, test results, and test status, and suggest defect prevention plans.
- Analyze and troubleshoot UAT/PROD defects to investigate the cause of the defect, follow up with developers and other team members to resolve issues, and maintain a regression strategy.
- Advocate the quality culture in cross-functional teams in the organization.
- Work in an Agile way, attend Agile ceremonies, and share QA perspectives in the discussions.
- Contribute to automation to reduce the manual testing effort.
- Seek ongoing self-improvement through knowledge of advancements in test tools and techniques, to proactively improve Sage QA infrastructure and processes.
- Prioritize own workload effectively.

Qualifications, Education & Experience
- 4-5 years of software quality assurance experience on enterprise-level applications.
- Must have experience in ETL/DWH/data migration/database testing.
- Must have experience working with large data sets.
- Azure cloud knowledge is a plus.
- Should have experience in relational databases and database architecture, with a good understanding of data models and ER diagrams.
- Has developed complex SQL scripts using joins, subqueries, analytical functions, and string functions to validate the completeness, integrity, and accuracy of data within an ETL process testing cycle (a minimal sketch of such checks follows this listing).
- Experience working in an Agile team.
- Extensively involved in requirement analysis and identifying the scope of testing.
- Experience in automation and writing Python scripts will be a plus.

Language, Analytical Skills and Person Specifications
- Highly motivated self-starter able to work to deadlines.
- Good organization, written and verbal communication skills in English; able to communicate effectively with technical and non-technical audiences.
- Strong analytical and logical thinking.
- Open and honest collaborative style.
- Must be a responsive, flexible team player.
- Strong attention to detail when testing and in documentation.
- Process and quality orientated.
- Excellent learning skills.
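To make the SQL-based validation requirement concrete, here is a minimal, runnable sketch of source-to-target reconciliation checks (completeness, integrity, accuracy) using Python's built-in sqlite3 module for illustration; the table and column names are hypothetical:

    # Source-to-target ETL validation checks against an in-memory SQLite
    # database; in practice these queries would run on the warehouse itself.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE src_orders (order_id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);
    """)

    # Completeness: row counts must match between source and target.
    src_count, = con.execute("SELECT COUNT(*) FROM src_orders").fetchone()
    tgt_count, = con.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()
    print(f"completeness: src={src_count} tgt={tgt_count} ok={src_count == tgt_count}")

    # Integrity: keys present in source but missing from target.
    missing = con.execute("""
        SELECT s.order_id FROM src_orders s
        LEFT JOIN tgt_orders t ON t.order_id = s.order_id
        WHERE t.order_id IS NULL
    """).fetchall()
    print("missing in target:", missing)

    # Accuracy: amounts that disagree for matching keys.
    mismatched = con.execute("""
        SELECT s.order_id, s.amount, t.amount
        FROM src_orders s JOIN tgt_orders t ON t.order_id = s.order_id
        WHERE s.amount <> t.amount
    """).fetchall()
    print("amount mismatches:", mismatched)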

Posted 4 days ago

Apply

15.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Description

What makes us Qlik?
A Gartner® Magic Quadrant™ Leader for 15 years in a row, Qlik transforms complex data landscapes into actionable insights, driving strategic business outcomes. Serving over 40,000 global customers, our portfolio leverages pervasive data quality and advanced AI/ML capabilities that lead to better decisions, faster. We excel in integration and governance solutions that work with diverse data sources, and our real-time analytics uncover hidden patterns, empowering teams to address complex challenges and seize new opportunities.

The Principal Solution Architect Role
Are you ready to take the lead in shaping the future of data and analytics? As a Principal Solution Architect, you'll be the go-to technical expert, guiding some of the largest customers and partners in the India region. You'll be at the forefront of demonstrating how cutting-edge data integration and analytics solutions can drive real business transformation. Collaborating closely with a dynamic Presales team in a flexible, agile environment, you'll have the opportunity to showcase your expertise while working with Sales, Marketing, R&D, Product, Consulting, and Customer Success teams. If you're looking for a role that is engaging, fast-paced, and full of opportunities to make an impact, this is it.

What makes this role interesting?
- Engage with high-profile customers and partners: Lead technical discussions and showcase innovative solutions to help organizations unlock the true power of their data.
- Drive business success with cutting-edge technology: Leverage Qlik's next-generation data analytics and data integration platform to solve complex business challenges.
- Be at the forefront of industry trends: Stay ahead of the game by keeping up with the latest advancements in data analytics, as well as the competitive landscape.
- Collaborate with cross-functional teams: Work closely with internal teams and experts across Sales, Marketing, R&D, and Customer Success to build compelling solutions that resonate with customers.
- Flexibility and agility: Thrive in an environment that values adaptability, innovation, and dynamic thinking.

Here's How You'll Be Making an Impact
- Own the technical sales cycle: Become a trusted advisor by guiding customers through technical evaluations, ensuring a seamless journey from exploration to adoption.
- Showcase innovation through tailored solutions: Deliver compelling presentations and custom demonstrations that address real customer needs and business challenges.
- Prove value through successful proof-of-concepts: Help customers experience the true power of Qlik's platform by leading impactful proof-of-concept engagements.
- Support business development efforts: Play a key role in driving regional revenue growth by supporting strategic sales initiatives and expanding Qlik's presence in the market.
- Position solutions for long-term success: Communicate effectively with stakeholders at all levels, from technical teams to senior leadership, ensuring alignment on the value and impact of Qlik's solutions.

We're Looking For a Teammate With
- At least 8 years of experience in a presales and/or consulting capacity.
- Strong experience in BI and analytics tools such as Qlik Sense.
- A good understanding of SQL and data modeling.
- A good understanding of machine learning tools and their usage, such as Python/R and other AI/ML and Gen AI technologies.
- Familiarity with cloud platforms and services.
- Good to have: knowledge of Data Integration (ETL), Data Quality (DQ), Data Governance, and iPaaS (APIs, microservices, application integration) will be a plus.
- Excellent communication skills for business as well as technical audiences.
- A highly driven mindset with strong interpersonal skills.
- A track record of developing relationships at technical, commercial, and executive levels throughout large enterprises.
- The ability to work independently and manage multiple complex opportunities.

Travel Requirements
- Willingness and ability to travel approximately 25%.
- Ability to travel internationally, if required.

The location for this role is: India – Delhi

If you're passionate about helping businesses harness the full potential of their data and want to be part of a team that values expertise, innovation, and collaboration, this is your opportunity to make a real difference. Apply today!

More About Qlik and Who We Are
Find out more about life at Qlik on social: Instagram, LinkedIn, YouTube, and X/Twitter, and to see all other opportunities to join us and our values, check out our Careers Page.

What else do we offer?
- Genuine career progression pathways and mentoring programs.
- A culture of innovation, technology, collaboration, and openness.
- A flexible, diverse, and international work environment.
- Giving back is a huge part of our culture: alongside an extra "change the world" day plus another for personal development, we also highly encourage participation in our Corporate Responsibility Employee Programs.

If you need assistance applying for a role due to a disability, please submit your request via email to accessibilityta @ qlik.com. Any information you provide will be treated according to Qlik's Recruitment Privacy Notice. Qlik may only respond to emails related to accommodation requests.

Qlik is not accepting unsolicited assistance from search firms for this employment opportunity. Please, no phone calls or emails. All resumes submitted by search firms to any employee at Qlik via email, the Internet, or in any form and/or method without a valid written search agreement in place for this position will be deemed the sole property of Qlik. No fee will be paid in the event the candidate is hired by Qlik as a result of the referral or through other means.

Posted 4 days ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

Remote

Company Description
Muoro.io partners with organizations to build dedicated engineering teams and offshore development centers with top talent curated for specific objectives. The company focuses on addressing talent shortage and managing remote technology teams, particularly in emerging technologies.

Role Description
This is a remote contract role for an Azure Data Engineer at Muoro. The Azure Data Engineer will be responsible for designing and implementing data solutions using Azure services, building and maintaining data pipelines, and optimizing data workflows for efficiency.

Qualifications
- Experience with Azure services, including Azure Data Factory, Azure Databricks, and Azure Synapse Analytics
- Proficiency in SQL and NoSQL databases
- Expertise in data modeling and ETL processes
- Strong analytical and problem-solving skills
- Experience with data visualization tools like Power BI or Tableau
- Bachelor's degree in Computer Science, Engineering, or a related field

Posted 4 days ago

Apply

2.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Organizational Context
Company Name: CIMET, a comparison software leader
Website: www.cimet.com.au
Product Link: https://www.econnex.com.au/
LinkedIn: https://www.linkedin.com/company/cimet/about

CIMET provides end-to-end comparison and signup of energy, telecommunication, credit card and other product plans through its online solution, in both B2B and B2C environments. It presently caters to more than 40+ utilities and financial service providers and has over 200+ partners. The online comparison market is extremely crowded in Australia, with most comparators providing a light-touch directory solution. CIMET saw this as an opportunity and developed a fully integrated online comparison and signup platform. On 15 March 2022, CIMET partnered with iSelect, which acquired a 49% stake in CIMET Holdings. Going forward, CIMET will use this investment from iSelect to expand into new products including credit cards, home loans, car loans, personal loans, life insurance, health insurance and pet insurance, among others. CIMET today has around 200+ team members across Australia, India and the Philippines, and will double over the next 2 years.

Purpose of the Position
We are looking for a highly experienced Senior Data Scientist with a deep understanding of AI/ML technologies, including recommendation systems, chatbots, Generative AI, and Large Language Models (LLMs). The ideal candidate will have hands-on experience in applying these technologies to solve real-world problems, working with large datasets, and collaborating with cross-functional teams to deliver innovative data-driven solutions.

Key Responsibilities
- Design, develop, and optimize recommendation systems to enhance user experience and engagement across platforms (a toy matrix-factorization sketch follows this listing).
- Build and deploy chatbots with advanced NLP capabilities for automating customer interactions and improving business processes.
- Lead the development of Generative AI solutions, including content generation and automation.
- Research and apply Large Language Models (LLMs) like GPT, BERT, and others to solve business-specific problems and create innovative solutions.
- Collaborate with engineering teams to integrate machine learning models into production systems, ensuring scalability and reliability.
- Perform data exploration, analysis, and feature engineering to improve model performance.
- Stay updated on the latest advancements in AI and ML technologies, proposing new techniques and tools to enhance our product capabilities.
- Mentor junior data scientists and engineers, providing guidance on best practices in AI/ML model development and deployment.
- Collaborate with product managers and business stakeholders to translate business goals into AI-driven solutions.
- Work on model interpretability and explainability, and ensure models are built in an ethical and responsible manner.

Required Skills and Qualifications
- 5+ years of experience in data science or machine learning, with a focus on building and deploying AI models.
- Strong expertise in designing and developing recommendation systems, and working with collaborative filtering, matrix factorization, and content-based filtering techniques.
- Hands-on experience with chatbots using Natural Language Processing (NLP) and conversational AI frameworks.
- In-depth understanding of Generative AI, including transformer-based models and GANs (Generative Adversarial Networks).
- Experience working with Large Language Models (LLMs) such as GPT, BERT, T5, etc.
- Proficiency in machine learning frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Strong programming skills in Python and libraries such as NumPy, pandas, Hugging Face, and NLTK.
- Experience with cloud platforms like AWS, GCP, or Azure for deploying and scaling machine learning models.
- Solid understanding of data pipelines, ETL processes, and working with large datasets using SQL or NoSQL databases.
- Knowledge of MLOps and experience deploying models in production environments.
- Strong problem-solving skills and a deep understanding of statistical methods and algorithms.

Preferred Qualifications
- Experience with reinforcement learning and recommender-system personalization techniques.
- Experience working with AWS Bedrock services.
- Familiarity with ethical AI and model-bias mitigation techniques.
- Experience with A/B testing, experimentation, and performance tracking for AI models in production.
- Prior experience mentoring junior data scientists and leading AI/ML projects.
- Strong communication and collaboration skills, with the ability to convey complex technical concepts to non-technical stakeholders.

Why Join Us?
- Opportunity to be part of a rapidly growing, innovative product-based company.
- Collaborate with a talented, driven team focused on building high-quality software solutions.
- Competitive compensation and benefits package.
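As a toy illustration of the matrix-factorization collaborative filtering named in the requirements (not CIMET's production approach), the following NumPy sketch factorizes a small ratings matrix with a truncated SVD and recommends each user's highest-scoring unseen item; treating zeros as unobserved ratings is a simplification:

    # Toy matrix-factorization recommender via truncated SVD; the ratings
    # matrix and rank are invented for illustration.
    import numpy as np

    # Rows = users, columns = items; 0 marks an unobserved rating.
    R = np.array([
        [5, 4, 0, 1],
        [4, 0, 0, 1],
        [1, 1, 0, 5],
        [0, 1, 5, 4],
    ], dtype=float)

    # Factorise with a rank-2 truncated SVD: R is approximated by U * S * Vt.
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    k = 2
    R_hat = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # Recommend, per user, the unseen item with the highest predicted score.
    for user in range(R.shape[0]):
        unseen = np.where(R[user] == 0)[0]
        best = unseen[np.argmax(R_hat[user, unseen])]
        print(f"user {user}: recommend item {best} (score {R_hat[user, best]:.2f})")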

Posted 4 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Location: Chennai
Looking for short-notice candidates. Please share your updated profile with sugantha.krishnan@acldigital.com.

Responsibilities:
- Collecting, cleaning, and preprocessing data from various sources
- Developing and applying machine learning algorithms and statistical models to solve business problems
- Conducting exploratory data analysis and data visualization
- Collaborating with cross-functional teams to identify business opportunities and develop data-driven solutions
- Developing and deploying data pipelines and ETL processes
- Communicating insights and findings to technical and non-technical stakeholders
- Evaluating the effectiveness and accuracy of models and recommending improvements
- Keeping up to date with the latest trends and technologies in the field of data science

Qualifications
- Minimum of 3 years of relevant experience with a Bachelor's or Master's degree in STEM (Science, Technology, Engineering, and Mathematics)
- Work experience where data science, data engineering and application development were a major part of your work, automating software deployments and following a continuous delivery and deployment model
- Experience with mathematical and statistical modeling, cloud environments, APIs, and Kubernetes
- Solid experience in programming with Python, SQL, or R
- Experience with visualization tools like Power BI
- Experience in data science, machine learning or artificial intelligence technologies

Posted 4 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Minimum qualifications:
- Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- 8 years of experience with software development in one or more programming languages (e.g., Python, C, C++, Java, JavaScript).
- 5 years of leadership experience and managing people.

Preferred qualifications:
- Experience in data analytics, warehousing, ETL development, data science or other Big Data applications.

About the Job
A line of code can be many things: an amazing feature, a beautiful UI, a transformative algorithm. The faster this line of code reaches millions of users, the sooner it impacts their lives. As a Software Engineer, Tools and Infrastructure, you will be at the heart of Google's engineering process, building software that empowers engineering teams to develop and deliver high-quality products quickly. We are focused on solving the hardest, most interesting challenges of developing software at scale without sacrificing stability, quality, velocity or code health. We ensure Google's success by partnering with engineering teams and developing scalable tools and infrastructure that help engineers develop, test, debug and release software quickly. We impact thousands of Googlers and billions of users by increasing the pace of product development and ensuring our products are thoroughly tested. We are advocates for code health, testability, maintainability and best practices for development and testing. Having access to all of Google's platforms and vast compute resources provides a unique opportunity to grow as an engineer. We typically work in small, nimble teams that collaborate on common problems across products and focus areas. As a result, the exposure to this broad set of problems provides technical challenges as well as accelerated career growth.

Google Photos is a photo sharing and storage service developed by Google. Photos is one of the most sought-after products at Google and is looking for Software Engineers across client-side (web and mobile), server-side (search, storage, serving), and machine intelligence (learning, computer vision) areas. We are dedicated to making Google experiences centered around the user.

Responsibilities
- Manage a team of Software Engineers to solve some of Photos and Google One's most critical analytics and experimentation challenges.
- Foster a collaborative partnership with stakeholders, including Data Scientists, Analysts, and Strategy, along with core partnerships with CoreData, to carve out and execute on long-term strategy.
- Build and enhance self-serve tools to help other teams create and manage data pipelines that generate metrics.
- Create critical dashboards for visualization, and help others create dashboards.
- Work on analytics infrastructure, which is also a core piece of infrastructure in the ML life cycle.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 4 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title/Role: .NET Technical Lead
Job Location: Chennai
Experience: 10+ Years

Job Summary:
We are seeking a highly skilled and experienced Technical Lead with a strong background in .NET technologies to lead a team of developers in designing, developing, and maintaining enterprise-level applications. The ideal candidate will have a passion for technology, a strong sense of ownership, and a proven track record of delivering high-quality software solutions.

Key Responsibilities:
- Lead the design, development, and deployment of .NET-based applications.
- Collaborate with cross-functional teams including product managers, QA, and DevOps.
- Provide technical guidance and mentorship to team members.
- Conduct code reviews and ensure adherence to best practices and coding standards.
- Architect scalable and maintainable solutions using .NET Core, C#, and related technologies.
- Troubleshoot and resolve complex technical issues.
- Stay updated with the latest industry trends and technologies.
- Participate in sprint planning, estimation, and agile ceremonies.
- Apply experience with data warehousing, ETL processes, and data modelling.
- Create technical design documents and data flow diagrams based on requirements.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in software development with at least 2 years in a technical leadership role.
- Strong proficiency in C#, .NET Core, ASP.NET MVC/Web API.
- Experience with Entity Framework, LINQ, SQL Server.
- Familiarity with front-end technologies like Angular, React, or Blazor is a plus.
- Extensive experience with the Microsoft Azure platform and services (App Services, Azure Functions, Azure SQL, Azure Data Factory, Azure Storage, etc.).
- Experience with data warehousing, ETL processes, and data modelling.
- Experience with CI/CD pipelines, Azure DevOps, or Git.
- Strong understanding of OOP, design patterns, software architecture, and security/API authentication.
- Ability to evaluate architectural options (complexity, performance, high availability, scalability, durability) and make the right recommendation for implementation.
- Ensure robust logging, tracing, and security mechanisms are in place.
- Excellent communication, problem-solving, and team management skills.

Preferred Qualifications:
- Experience with cloud platforms like Microsoft Azure or AWS.
- Experience migrating Azure/on-prem solutions to AWS.
- Certifications in .NET, Azure or AWS are a plus.

Perks and Benefits:
- Competitive salary and benefits
- Group medical insurance
- ICICI Bank multi-wallet
- Collaborative workspace
- Flexible working hours
- Hybrid working model

What Makes Working at OEC Awesome?
We have a new OEC Technology Centre of Excellence in Chennai, India! Our team is beyond thrilled to work with the new office, but we're even more excited for the innovation and creativity that this living space will certainly inspire! We believe in surrounding ourselves with not only the best and the brightest individuals, but those that are unique and purpose-driven in all that they do. OEC India has been selected as one of the 'Top 25 Safest Workplaces in India' by KelpHR.

OEC provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, colour, religion, creed, gender, sex (including pregnancy, childbirth, and related medical conditions), sexual orientation, gender identity, national origin, age, disability, genetic information or characteristics, marital status, familial status, veteran or military status, status regarding public assistance, membership or activity in a local commission, or any other protected status in accordance with applicable federal, state and local law.

Posted 4 days ago

Apply

5.0 years

0 Lacs

India

On-site

Maximize Your Impact with TP
Welcome to TP, a global hub of innovation and empowerment, where we redefine the future. With a remarkable €10 billion annual revenue and a global team of 500,000 employees serving 170 countries in over 300 languages, we lead in intelligent, digital-first solutions. As a globally certified Great Place to Work in 72 countries, our culture thrives on diversity, equity, and inclusion. We value your unique perspective and believe that your talent is the missing piece that completes our vision for a brighter, digitally driven tomorrow.

The Opportunity
The AI Data Engineer designs, develops, and maintains robust data pipelines to support AI data services operations, ensuring smooth ingestion, transformation, and extraction of large, multilingual, and multimodal datasets. This role collaborates with cross-functional teams to optimize data workflows, implement quality checks, and deliver scalable solutions that underpin our analytics and AI/ML initiatives.

The Responsibilities
- Create and manage ETL workflows using Python and relevant libraries (e.g., pandas, NumPy) for high-volume data processing (a minimal sketch follows this listing).
- Monitor and optimize data workflows to reduce latency, maximize throughput, and ensure high-quality data availability.
- Work with Platform Operations, QA, and Analytics teams to guarantee seamless data integration and consistent data accuracy.
- Implement validation processes and address anomalies or performance bottlenecks in real time.
- Develop REST API integrations and Python scripts to automate data exchanges with internal systems and BI dashboards.
- Maintain comprehensive technical documentation, data flow diagrams, and best-practice guidelines.

The Qualifications
- Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field, with relevant coursework in Python programming, database management, or data integration techniques.
- 3–5 years of professional experience in data engineering, ETL development, or similar roles, with a proven track record of building and maintaining scalable data pipelines.
- Experience working with SQL databases (e.g., MySQL, PostgreSQL) and NoSQL solutions (e.g., MongoDB).
- AWS Certified Data Analytics – Specialty, Google Cloud Professional Data Engineer, or similar certifications are a plus.
- Advanced Python proficiency with data libraries (pandas, NumPy, etc.).
- Familiarity with ETL/orchestration tools (e.g., Apache Airflow).
- Understanding of REST APIs and integration frameworks.
- Experience with version control (Git) and continuous integration practices.
- Exposure to cloud-based data solutions (AWS, Azure, or GCP) is advantageous.

Pre-Employment Screenings
By TP policy, employment in this position will be contingent on your successful completion and passage of a comprehensive background check, including global sanctions and watch list screening.

Important | Policy on Unsolicited Third-Party Candidate Submissions
TP does not accept candidate submissions from unsolicited third parties, including recruiters or headhunters. Applications will not be considered, and no contractual association will be established through such submissions.

Diversity, Equity & Inclusion
At TP, we are committed to fostering a diverse, equitable, and inclusive workplace. We welcome individuals from all backgrounds and lifestyles and do not discriminate based on gender identity or expression, sexual orientation, race, religion, age, national origin, citizenship, disability, pregnancy status, veteran status, or other differences.
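Here is a minimal sketch of the pandas-based ETL-with-validation workflow described above, using an inline toy dataset; the schema, checks, and output path are assumptions for illustration:

    # Extract-transform-validate-load with pandas; the dataset is synthetic
    # and the column names are hypothetical.
    import pandas as pd

    # Extract: stand-in for reading a raw multilingual dataset from a source system.
    raw = pd.DataFrame({
        "id": [1, 2, 2, 3],
        "text": ["hola", "bonjour", "bonjour", None],
        "locale": ["es", "fr", "fr", "de"],
    })

    # Transform: drop duplicate records and rows failing basic quality checks.
    clean = raw.drop_duplicates(subset="id").dropna(subset=["text"])

    # Validate: fail fast if anomalies remain before loading downstream.
    assert clean["id"].is_unique, "duplicate ids survived deduplication"
    assert clean["text"].notna().all(), "null text rows survived cleaning"

    # Load: write the curated slice for downstream analytics (illustrative path).
    clean.to_csv("curated_dataset.csv", index=False)
    print(clean)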

Posted 4 days ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra, India

On-site

Wissen Technology is Hiring for SQL with Python

About Wissen Technology:
Wissen Technology is a globally recognized organization known for building solid technology teams, working with major financial institutions, and delivering high-quality solutions in IT services. With a strong presence in the financial industry, we provide cutting-edge solutions to address complex business challenges.

Role Overview:
We are looking for a skilled and detail-oriented candidate with a strong foundation in SQL, Python, and data processing techniques. The ideal candidate is passionate about transforming raw data into meaningful insights and has hands-on experience across the data pipeline, from data wrangling to visualization.

Experience: 3-7 Years
Location: Bengaluru

Required Skills:
- Strong experience with SQL (e.g., joins, subqueries, CTEs, window functions); a short runnable sketch follows this listing.
- Proficiency in Python for data manipulation (e.g., pandas, NumPy).
- Experience working with relational databases like MySQL, PostgreSQL, SQL Server, or Oracle.
- Hands-on experience in data wrangling, cleaning, and feature engineering.
- Understanding of ETL processes and tools.
- Familiarity with version control systems like Git.
- Knowledge of data visualization techniques and tools.
- Strong problem-solving and analytical skills.

The Wissen Group was founded in the year 2000. Wissen Technology, a part of Wissen Group, was established in the year 2015. Wissen Technology is a specialized technology company that delivers high-end consulting for organizations in the Banking & Finance, Telecom, and Healthcare domains. We help clients build world-class products. We offer an array of services including Core Business Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud Adoption, Mobility, Digital Adoption, Agile & DevOps, and Quality Assurance & Test Automation.

Over the years, Wissen Group has successfully delivered $1 billion worth of projects for more than 20 of the Fortune 500 companies. Wissen Technology provides exceptional value in mission-critical projects for its clients through thought leadership, ownership, and assured on-time deliveries that are always 'first time right'.

The technology and thought leadership that the company commands in the industry is the direct result of the kind of people Wissen has been able to attract. Wissen is committed to providing them with the best possible opportunities and careers, which extends to providing the best possible experience and value to our clients. We have been certified as a Great Place to Work® company for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider. Great Place to Work® Certification is recognized the world over by employees and employers alike and is considered the 'Gold Standard'. Wissen Technology has created a Great Place to Work by excelling in all dimensions: High-Trust, High-Performance Culture, Credibility, Respect, Fairness, Pride and Camaraderie.

Website: www.wissen.com
LinkedIn: https://www.linkedin.com/company/wissen-technology
Wissen Leadership: https://www.wissen.com/company/leadership-team/
Wissen Live: https://www.linkedin.com/company/wissen-technology/posts/feedView=All
Wissen Thought Leadership: https://www.wissen.com/articles/
Employee Speak:
https://www.ambitionbox.com/overview/wissen-technology-overview
https://www.glassdoor.com/Reviews/Wissen-Infotech-Reviews-E287365.htm
Great Place to Work:
https://www.wissen.com/blog/wissen-is-a-great-place-to-work-says-the-great-place-to-work-institute-india/
https://www.linkedin.com/posts/wissen-infotech_wissen-leadership-wissenites-activity-6935459546131763200-xF2k
About the Wissen Interview Process: https://www.wissen.com/blog/we-work-on-highly-complex-technology-projects-here-is-how-it-changes-whom-we-hire/
Latest on Wissen in CIO Insider: https://www.cioinsiderindia.com/vendor/wissen-technology-setting-new-benchmarks-in-technology-consulting-cid-1064.html
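To ground the SQL requirements in the Wissen listing above (CTEs and window functions), here is a short, runnable sketch against an in-memory SQLite database; the trades table and its contents are invented, and SQLite 3.25+ is assumed for window-function support:

    # CTE plus window function: running total of each trader's volume by date.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE trades (trader TEXT, trade_date TEXT, amount REAL);
        INSERT INTO trades VALUES
            ('alice', '2024-01-01', 100), ('alice', '2024-01-02', 250),
            ('bob',   '2024-01-01', 300), ('bob',   '2024-01-03', 50);
    """)

    rows = con.execute("""
        WITH ordered AS (
            SELECT trader, trade_date, amount FROM trades
        )
        SELECT trader, trade_date,
               SUM(amount) OVER (
                   PARTITION BY trader ORDER BY trade_date
               ) AS running_total
        FROM ordered
        ORDER BY trader, trade_date
    """).fetchall()
    for r in rows:
        print(r)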

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

P1-C3-STS

Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation and other AWS serverless resources. The candidate should be able to:
- Optimize data models for performance and efficiency.
- Write SQL queries to support data analysis and reporting (a minimal boto3/Athena sketch follows this listing).
- Design, implement, and maintain the data architecture for all AWS data services.
- Work with stakeholders to identify business needs and requirements for data-related projects.
- Design and implement ETL processes to load data into the data warehouse.
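As a hedged illustration of the Athena-plus-Python stack this posting names, the sketch below starts an Athena query with boto3 and polls for the result; the database, table, region, and S3 output bucket are placeholders, and configured AWS credentials are assumed:

    # Run an Athena query from Python and print the result rows; Athena is
    # asynchronous, so we poll the query execution state.
    import time
    import boto3

    athena = boto3.client("athena", region_name="us-east-1")

    qid = athena.start_query_execution(
        QueryString="SELECT status, COUNT(*) AS n FROM orders GROUP BY status",
        QueryExecutionContext={"Database": "analytics_db"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )["QueryExecutionId"]

    while True:
        state = athena.get_query_execution(
            QueryExecutionId=qid
        )["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(1)

    if state == "SUCCEEDED":
        # The first row returned is the column-header row.
        for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
            print([f.get("VarCharValue") for f in row["Data"]])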

Posted 4 days ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Summary

Senior Analyst – Data Engineer - Deloitte Technology - Deloitte Support Services India Private Limited

Do you thrive on developing creative and innovative insights to solve complex challenges? Want to work on next-generation, cutting-edge products and services that deliver outstanding value and that are global in vision and scope? Work with premier thought leaders in your field? Work for a world-class organization that provides an exceptional career experience with an inclusive and collaborative culture?

Work you’ll do

We are seeking a candidate with extensive experience in designing, delivering and maintaining cloud solutions, specifically on Microsoft Azure. This candidate should also possess strong cross-discipline communication skills, strong analytical aptitude with critical thinking, a solid understanding of how data translates into reporting and dashboarding capabilities, and a command of the tools and platforms that support them.

Responsibilities

Role Specific
• Design well-structured data models using established methodologies (e.g., Kimball or Inmon) that accurately represent the business requirements, ensure data integrity and minimize redundancy.
• Develop and implement data pipelines to extract, transform, and load (ETL) data from various sources into Azure data services, using Azure Data Factory, Azure Databricks, or other tools to orchestrate data workflows and data movement.
• Build, test and run data assets tied to tasks and user stories from the Azure DevOps instance of Enterprise Data & Analytics.
• Bring technical expertise in the Big Data space that contributes to the strategic roadmaps for Enterprise Data Architecture, Global Data Cloud Architecture, and Global Business Intelligence Architecture, as well as to the development of the broader Enterprise Data & Analytics Engineering community.
• Actively participate in regularly scheduled contact calls to transparently review the status of in-flight projects and the priorities of backlog projects, and to review adoption of previous deliveries from Enterprise Data & Analytics with the Data Insights team.
• Handle break fixes and participate in a rotational on-call schedule; on-call includes monitoring of scheduled jobs and ETL pipelines.
• Actively participate in team meetings to transparently review the status and progress of in-flight projects.
• Follow standard practices and frameworks on each project from development, through testing, to productionizing, each within the appropriate environment laid out by Data Architecture.
• Challenge self and others to make an impact that matters, and help the team connect their contributions with the broader purpose.
• Set expectations for the team, align the work with members' strengths and competencies, and challenge them to raise the bar while providing support.
• Apply extensive knowledge of multiple technologies, tools, and processes to improve the design and architecture of the assigned applications.

Knowledge Sharing / Documentation
• Contribute to, produce, and maintain processes, procedures, and operational and architectural documentation.
• Change control: ensure compliance with processes and adherence to standards and documentation.
• Work with Deloitte Technology leadership and service teams in reviewing documentation and aligning KPIs to critical steps in our service operations.
• Actively participate in ongoing training within the BI space.

The team

At Deloitte, we’re all about collaboration. And nowhere is this more apparent than among our 2,000-strong internal services team. With our combined specialist skills, we provide all the essential support and advice our client-facing colleagues need, right across the firm. This enables them to focus all of their efforts on delivering the best service possible to their clients. Covering seven distinct areas (Human Resources, Clients & Industries, Finance & Legal, Practice Support Services, Quality & Risk Services, IT Services, and Workplace Services & Real Estate), together we live, breathe and deliver the Deloitte experience.

Location: Hyderabad
Work shift timings: 11 AM to 8 PM

Qualifications
• Bachelor of Engineering / Bachelor of Technology.
• 3-6 years of broad-based IT experience with technical knowledge of Microsoft SQL Server, Azure SQL Data Warehouse, Azure Data Lake Store, and Azure Data Factory.
• Demonstrated experience with the Apache framework (Spark, Scala, etc.).
• Well versed in SQL and comfortable scripting in Python or a similar language.

First Month Critical Outcomes
• Absorb strategic projects from the backlog and complete the related Azure SQL Data Warehouse development work.
• Inspect existing run-state SQL Server databases and Azure SQL Data Warehouses and identify optimizations for potential development.
• Deliver new databases assigned as needed.
• Integrate into the on-call rotation (first 90 days).
• Contribute to legacy content and architecture migration to the data lake (first 90 days).
• Deliver the first two data ingestion pipelines, including ingestion, QA and automation using Azure Big Data tools (first 90 days).
• Document all work following the standard documentation practices set forth by Data Governance (first 90 days).

How You’ll Grow

At Deloitte, we’ve invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities, including exposure to leaders, sponsors, coaches, and challenging assignments, to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources including live classrooms, team-based learning, and eLearning. DU: The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad offices, is an extension of the Deloitte University (DU) in Westlake, Texas, and represents a tangible symbol of our commitment to our people’s growth and development. Explore DU: The Leadership Center in India.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte’s culture

Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship

Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte’s impact on the world.

#EAG-Technology

Recruiting tips

From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Our people and culture

Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Our purpose

Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Professional development

From entry-level employees to senior leaders, we believe there’s always room to learn. We offer opportunities to build new skills, take on leadership opportunities and connect and grow through mentorship. From on-the-job learning experiences to formal development programs, our professionals have a variety of opportunities to continue to grow throughout their career.

Requisition code: 304653
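To illustrate the kind of pipeline work described in the responsibilities above, here is a minimal, hypothetical PySpark sketch of an extract-transform-load step of the sort that might run in Azure Databricks. The storage paths and column names are invented for illustration; they are not from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV landed in the data lake (hypothetical ABFS path).
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

# Transform: type the columns and drop obviously bad rows.
clean = (raw
         .withColumn("order_date", to_date(col("order_date")))
         .withColumn("amount", col("amount").cast("decimal(18,2)"))
         .filter(col("order_id").isNotNull()))

# Load: write partitioned Parquet to the curated zone.
(clean.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("abfss://curated@examplelake.dfs.core.windows.net/orders/"))
```

In practice, a job like this would typically be triggered by an orchestrator such as Azure Data Factory rather than run ad hoc.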

Posted 4 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: ETL Tester
Work Mode: Hybrid
Work timings: 2 PM to 11 PM
Location: Chennai & Hyderabad
Primary Skills: ETL

Responsibilities
• Examine business needs to determine the appropriate automation testing technique.
• Maintain existing regression suites and test scripts.
• Attend agile ceremonies, including backlog refinement, sprint planning, and daily scrum meetings.
• Execute regression suites and share results with developers, project managers, stakeholders, and manual testers.
• Develop and execute test plans, test cases, and test scripts for ETL processes.
• Validate data extraction, transformation, and loading workflows.
• Analyze test results and provide detailed reports to stakeholders.
• Automate repetitive testing tasks to improve efficiency.
• Apply a strong SQL base to validate transformations.

Expected skill proficiency
• Strong ETL testing.
• Strong SQL: in-depth understanding of SQL queries and how to apply them in QA testing.
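As a flavor of the SQL-based validation this role calls for, here is a minimal sketch in PySpark that reconciles row counts and spot-checks a transformed column between a staging table and a target table. The table and column names (staging.orders_raw, warehouse.orders, total_amount, and so on) are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl-validation").getOrCreate()

# Hypothetical table names; a real suite would parameterize these.
SOURCE, TARGET = "staging.orders_raw", "warehouse.orders"

# 1. Row-count reconciliation: the load should not silently drop records.
src_count = spark.table(SOURCE).count()
tgt_count = spark.table(TARGET).count()
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# 2. Transformation spot-check: verify a derived column matches its rule.
mismatches = spark.sql(f"""
    SELECT t.order_id
    FROM {TARGET} t
    JOIN {SOURCE} s ON s.order_id = t.order_id
    WHERE t.total_amount != s.quantity * s.unit_price
""").count()
assert mismatches == 0, f"{mismatches} rows violate the total_amount rule"
```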

Posted 4 days ago

Apply

0.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Company Overview

KLA is a global leader in diversified electronics for the semiconductor manufacturing ecosystem. Virtually every electronic device in the world is produced using our technologies. No laptop, smartphone, wearable device, voice-controlled gadget, flexible screen, VR device or smart car would have made it into your hands without us. KLA invents systems and solutions for the manufacturing of wafers and reticles, integrated circuits, packaging, printed circuit boards and flat panel displays. The innovative ideas and devices that are advancing humanity all begin with inspiration, research and development. KLA focuses more than average on innovation, and we invest 15% of sales back into R&D. Our expert teams of physicists, engineers, data scientists and problem-solvers work together with the world’s leading technology providers to accelerate the delivery of tomorrow’s electronic devices. Life here is exciting and our teams thrive on tackling really hard problems. There is never a dull moment with us.

Group/Division

The Information Technology (IT) group at KLA is involved in every aspect of the global business. IT’s mission is to enable business growth and productivity by connecting people, process, and technology. It focuses not only on enhancing the technology that enables our business to thrive but also on how employees use and are empowered by technology. This integrated approach to customer service, creativity and technological excellence enables employee productivity, business analytics, and process excellence.

Job Description/Preferred Qualifications

As a Sr. Data Engineer on the Data Sciences and Analytics team, you will play a key role in KLA’s data strategy principles and techniques. As part of the centralized analytics team, you will help analyze and surface key data insights into various business unit processes across the company, providing key performance indicators and dashboards that help business users and partners make business-critical decisions. You will craft and develop analytical solutions by capturing business requirements and translating them into technical specifications, building data models and data visualizations.

Responsibilities:
• Design, develop and deploy Microsoft Fabric solutions, Power BI reports and dashboards.
• Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
• Develop data models and establish data connections to various data sources.
• Apply expert knowledge of Microsoft Fabric architecture, deployment, and management.
• Optimize Power BI solutions for performance and scalability.
• Implement best practices for data visualization and user experience.
• Conduct code reviews and provide mentorship to junior developers.
• Manage permissions and workspaces in Power BI, ensuring a secure and efficient analytics platform.
• Conduct assessments and audits of existing Microsoft Fabric environments to identify areas for improvement.
• Stay current with the latest Fabric and Power BI features and updates.
• Troubleshoot and resolve issues related to Fabric objects, Power BI reports and data sources.
• Create detailed documentation, including design specifications, implementation plans, and user guides.

Minimum Qualifications
• Doctorate (Academic) Degree and 0 years related work experience; or Master's Level Degree and 3 years related work experience; or Bachelor's Level Degree and 5 years related work experience.
• Proven experience as a Power BI Developer, with a strong portfolio of Power BI projects.
• In-depth knowledge of Power BI, including DAX, Power Query, and data modeling.
• Experience with SQL and other data manipulation languages.
• In-depth knowledge of Microsoft Fabric and Power BI, including their components and capabilities.
• Strong understanding of Azure cloud computing, data integration, and data management.
• Excellent problem-solving skills and the ability to work independently and as part of a team.
• Excellent technical problem-solving and performance-optimization skills.
• Specialist in SQL and stored procedures, with data warehouse concepts.
• Experience performing ETL (Extract, Transform, Load) processes.
• Exceptional communication and interpersonal skills.
• Expert knowledge of cloud and big data concepts and tools: Azure, AWS, Data Lake, Snowflake, etc.

Nice to have:
• Extremely strong SQL skills.
• Foundational knowledge of metadata management, Master Data Management, data governance, and data analytics.
• Technical knowledge of Databricks, Data Lake, Spark, and SQL.
• Experience configuring SSO (Single Sign-On), RBAC, and security roles on an analytics platform.
• SAP functional knowledge is a plus.
• Microsoft certifications related to Microsoft Fabric/Power BI or Azure/analytics are a plus.
• Good understanding of requirements and converting them into data warehouse solutions.

We offer a competitive, family friendly total rewards package. We design our programs to reflect our commitment to an inclusive environment, while ensuring we provide benefits that meet the diverse needs of our employees. KLA is proud to be an equal opportunity employer.

Be aware of potentially fraudulent job postings or suspicious recruiting activity by persons posing as KLA employees. KLA never asks for any financial compensation to be considered for an interview, to become an employee, or for equipment. Further, KLA does not work with any recruiters or third parties who charge such fees either directly or on behalf of KLA. Please ensure that you have searched KLA’s Careers website for legitimate job postings. KLA follows a recruiting process that involves multiple interviews in person or on video conferencing with our hiring managers. If you are concerned that a communication, an interview, an offer of employment, or an employee is not legitimate, please send an email to talent.acquisition@kla.com to confirm that the person you are communicating with is an employee. We take your privacy very seriously and handle your information confidentially.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


We are seeking a skilled Database Migration Specialist with deep expertise in mainframe modernization and data migration to cloud platforms such as AWS, Azure, or GCP. The ideal candidate will have hands-on experience migrating legacy systems (COBOL, DB2, IMS, VSAM, etc.) to modern cloud-native databases like PostgreSQL, Oracle, or NoSQL.

What will your job look like?
• Lead and execute end-to-end mainframe-to-cloud database migration projects.
• Analyze legacy systems (z/OS, Unisys) and design modern data architectures.
• Extract, transform, and load (ETL) complex datasets, ensuring data integrity and taxonomy alignment.
• Collaborate with cloud architects and application teams to ensure seamless integration.
• Optimize performance and scalability of migrated databases.
• Document migration processes, tools, and best practices.

Required Skills & Experience
• 5+ years in mainframe systems (COBOL, CICS, DB2, IMS, JCL, VSAM, Datacom).
• Proven experience in cloud migration (AWS DMS, Azure Data Factory, GCP Dataflow, etc.).
• Strong knowledge of ETL tools, data modeling, and schema conversion.
• Experience with PostgreSQL, Oracle, or other cloud-native databases.
• Familiarity with data governance, security, and compliance in cloud environments.
• Excellent problem-solving and communication skills.
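To make the extract step of such a migration concrete, here is a minimal, hypothetical Python sketch of parsing fixed-width records of the kind exported from VSAM datasets described by COBOL copybooks, before loading them into a cloud database. The field layout and sample record are invented for illustration.

```python
from decimal import Decimal

# Hypothetical copybook-style layout: (field name, start offset, length).
LAYOUT = [("customer_id", 0, 8), ("name", 8, 20), ("balance_cents", 28, 10)]

def parse_record(line: str) -> dict:
    """Slice one fixed-width record into trimmed, typed fields."""
    row = {name: line[start:start + length].strip() for name, start, length in LAYOUT}
    # Amounts are often stored as integer cents; convert to a decimal value.
    row["balance"] = Decimal(row.pop("balance_cents") or "0") / 100
    return row

sample = "00001234JANE DOE            0000012550"
print(parse_record(sample))
# Expected: {'customer_id': '00001234', 'name': 'JANE DOE', 'balance': Decimal('125.5')}
```

Real migrations would also handle EBCDIC decoding, packed-decimal (COMP-3) fields, and redefines, which tools like AWS DMS or custom converters address.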

Posted 4 days ago

Apply

5.0 years

6 - 9 Lacs

Hyderābād

On-site

Job Description

Our company is an innovative, global healthcare leader that is committed to improving health and well-being around the world with a diversified portfolio of prescription medicines, vaccines and animal health products. We continue to focus our research on conditions that affect millions of people around the world - diseases like Alzheimer's, diabetes and cancer - while expanding our strengths in areas like vaccines and biologics. Our ability to excel depends on the integrity, knowledge, imagination, skill, diversity and teamwork of an individual like you. To this end, we strive to create an environment of mutual respect, encouragement and teamwork. As part of our global team, you’ll have the opportunity to collaborate with talented and dedicated colleagues while developing and expanding your career.

As a Digital Supply Chain Data Modeler/Engineer, you will work as a member of the Digital Manufacturing Division team supporting the Enterprise Orchestration Platform. You will be responsible for identifying, assessing, and solving complex business problems related to manufacturing and supply chain. You will receive training to achieve this, and you’ll be amazed at the diversity of opportunities to develop your potential and grow professionally. You will collaborate with business stakeholders to determine the analytical capabilities that enable the creation of insights-focused solutions aligned to business needs, and you will ensure that delivery of these solutions meets quality requirements.

The Opportunity

Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company’s IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps to ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. And together, we must leverage the strength of our team to collaborate globally to optimize connections and share best practices across the Tech Centers.

Role

As Data Modeler Lead, you will be responsible for the following:
• Deliver divisional analytics initiatives with a primary focus on data modeling for all analytics, advanced analytics and AI/ML use cases, e.g., self-service, business intelligence and analytics, data exploration, data wrangling, etc.
• Host and lead requirement and process workshops to understand data modeling requirements.
• Analyze business requirements and work with the architecture team to deliver and contribute to feasibility analyses, implementation plans and high-level estimates.
• Based on business processes and analysis of data sources, deliver detailed ETL designs with data-model mappings covering all areas of data warehousing for all analytics use cases.
• Create data models and transformation mappings in a modeling tool and deploy them to databases, including creation of scheduled orchestration jobs.
• Deploy data modeling configuration to target systems (SIT, UAT and Prod).
• Understand product ownership and management; lead the data model as a product for focus areas of the digital supply chain domain.
• Create the required SDLC documentation as per project requirements.
• Optimize and industrialize existing database and data transformation solutions.
• Prepare and update data modeling and data warehousing best practices along with foundational platforms.
• Work closely with foundational product teams, the business, vendors and technology support teams to deliver business initiatives.

Position Qualifications:

Education Minimum Requirement:
• B.S. or M.S. in IT, Engineering, Computer Science, or related field.

Required Experience and Skills:
• 5+ years of relevant work experience, with demonstrated expertise in data modeling in DWH, data mesh or other analytics-related implementations; experience implementing end-to-end DWH solutions, including designing the DWH and deploying the solution.
• 3+ years of experience creating logical and physical data models in a modeling tool (SAP PowerDesigner, WhereScape, etc.).
• Experience creating data modeling standards, best practices and implementation processes.
• High proficiency in information management, data analysis and reporting requirement elicitation.
• Experience extracting business rules to develop transformations, data lineage, and dimensional data models.
• Experience validating legacy and newly developed data model outputs.
• Development experience using WhereScape and similar ETL/data modeling tools.
• Exposure to Qlik or similar BI dashboarding applications.
• Advanced knowledge of SQL and data transformation practices.
• Deep understanding of data modeling and preparation of optimal data structures.
• Ability to communicate with business, data transformation and reporting teams.
• Knowledge of ETL methods, and a willingness to learn ETL technologies.
• Fluent communication in English.
• Experience in Redshift or similar databases, including DDL, DML, query optimization, schema management, security, etc.
• Experience with Airflow or similar orchestration tools.
• Exposure to CI/CD tools.
• Exposure to AWS modules such as S3, the AWS Console, Glue, Spectrum, etc.
• Ability to independently support business discussions and analyze, develop and deliver code.

Preferred Experience and Skills:
• Experience working on projects where Agile methodology is leveraged.
• Understanding of data management best practices and data analytics.
• Ability to lead requirements sessions with clients and project teams.
• Strong leadership and verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
• Demonstrated experience in the life science space.
• Exposure to SAP and Rapid Response domain data is a plus.

Current Employees apply HERE

Current Contingent Workers apply HERE

Search Firm Representatives Please Read Carefully

Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills: Agile Data Warehousing, Agile Methodology, Animal Vaccination, Business, Business Communications, Business Initiatives, Business Intelligence (BI), Computer Science, Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Data Warehousing (DW), Design Applications, Digital Supply Chain, Digital Supply Chain Management, Digital Transformation, Information Management, Information Technology Operations, Software Development, Software Development Life Cycle (SDLC), Supply Chain Optimization, Supply Management, System Designs

Preferred Skills:

Job Posting End Date: 07/31/2025

A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day before the job posting end date.

Requisition ID: R352794
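Since the role above mentions creating scheduled orchestration jobs and experience with Airflow, here is a minimal, hypothetical sketch of an Airflow DAG that refreshes warehouse tables on a schedule. It assumes Airflow 2.4+ (where the schedule parameter replaces schedule_interval); the DAG id, task names, and callables are invented for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def refresh_dimension_tables():
    # Placeholder for the actual transformation logic (e.g., modeling-tool or SQL calls).
    print("Refreshing dimension tables...")

def refresh_fact_tables():
    print("Refreshing fact tables...")

with DAG(
    dag_id="dwh_model_refresh",      # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",            # daily at 02:00
    catchup=False,
) as dag:
    dims = PythonOperator(task_id="refresh_dimensions",
                          python_callable=refresh_dimension_tables)
    facts = PythonOperator(task_id="refresh_facts",
                           python_callable=refresh_fact_tables)
    dims >> facts  # facts load after dimensions, as in a typical star-schema refresh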

Posted 4 days ago

Apply

3.0 - 20.0 years

0 Lacs

Hyderābād

On-site

We're Hiring – Sr. Data Engineer or Architect (Hyderabad, India)

Altura Innovative Technologies Pvt Ltd is looking for a Data Engineer with expertise in PySpark, Python, and DSA to join our client team!

Experience: 3 to 20 years
Key Skills: PySpark, Python, Data Structures & Algorithms, ETL, SQL, Cloud (AWS/Azure/GCP) or Palantir Foundry; Foundry or GIS is a must.
Location: Hyderabad, India (Onsite – 5 days per week)
Work Timings: 2 PM – 10 PM IST

If you're passionate about big data, scalable pipelines, and cloud technologies, we want to hear from you! Share your updated resume to careers@alturaitech.com or contact 8179033240.

Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹3,000,000.00 per year
Benefits: Health insurance, Paid time off, Provident Fund
Schedule: Day shift / UK shift
Work Location: In person

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description

Experience required: 7 to 9 years

Primary Skills: Java, Spring, Spring Boot, and cloud development (AWS/Azure).
Secondary Skills: REST APIs, knowledge of big data tools (ETL processes, data ingestion), and React.

Responsibilities:
• Ensure a high-quality and sustainable code base by establishing Test-Driven Development (TDD) as the foundation of development.
• Ensure reliable and quality deployments using CI/CD.
• Meet SLAs and service uptime targets.
• Help establish a DevSecOps culture on the delivery team by implementing secure DevOps principles.
• Drive product innovations leveraging the cloud and other industry-leading technologies.
• Root-cause, debug, and fix application issues, and drive initiatives to prevent recurrence within the product's code base.

Posted 4 days ago

Apply

0 years

5 Lacs

Hyderābād

On-site

Job description

Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Consultant Specialist.

In this role, you will:
• Coordinate with stakeholders to ensure timely deliverables.
• Provide solution architecture support to projects where required, ensuring that the defined solution meets business needs and is aligned to the functional and target architecture, with any deviations approved.
• Analyze legacy systems and propose plans for their decommissioning.
• Lead a team of data engineers and assume responsibility as Technical Lead for assigned projects.
• Take full ownership of, and efficiently manage, the GDT IT services and products.
• Ensure that any new technology products are taken through the technology design governance process.
• Mentor and coach less experienced members of staff, and promote an understanding of the value of architecture and of the use of technologies and standards in their domain across IT.
• Periodically monitor team progress.
• Deliver optimal solutions that meet client requirements.
• Provide inputs for estimation and monitoring, and coordinate team-related activities.
• Be involved in designing, developing, unit testing and performance testing the application.

Requirements

To be successful in this role, you should meet the following requirements:
• Deep knowledge of cloud architecture on GCP, AWS or Azure, with preference for GCP.
• Proficiency in Google Cloud and cloud-native technologies.
• Ability to work with senior stakeholders and various business parties and drive business discussions.
• A track record of making complex business decisions with authority, even in times of ambiguity, considering the potential long-term risks and implications.
• DevOps and automation design experience.
• Experience with ETL tools like IBM DataStage.
• Experience in Hadoop/Hive.
• Experience in solution architecture.
• Relevant experience in BigQuery, Google Analytics, Dataflow, Pub/Sub, Cloud SQL, Qlik Sense and Spark.
• Experience designing, building and configuring applications to meet business process requirements.
• Experience with on-prem to GCP data migration projects.
• Concepts: RDBMS, SDLC, OLAP, logical/physical dimensional modeling.
• Programming languages: PL/SQL, SQL, Unix scripting, Java, Python.
• Operating systems: Windows 2000/NT, Unix, Linux.
• Experience working in an Agile environment; well versed with Agile/Scrum practices.
• Experience building business intelligence solutions.

You’ll achieve more when you join HSBC. www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.
Issued by – HSDI
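Given the role's emphasis on BigQuery and GCP data work, here is a minimal, hypothetical sketch using the google-cloud-bigquery Python client to run an aggregation query. The project, dataset, and table names are invented; credentials are assumed to come from the environment.

```python
from google.cloud import bigquery

# Credentials resolve from the environment (e.g., GOOGLE_APPLICATION_CREDENTIALS).
client = bigquery.Client(project="my-analytics-project")  # hypothetical project id

query = """
    SELECT trade_date, COUNT(*) AS trade_count
    FROM `my-analytics-project.markets.trades`   -- hypothetical table
    WHERE trade_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY trade_date
    ORDER BY trade_date
"""

# query() submits the job; result() blocks until it completes and returns rows.
for row in client.query(query).result():
    print(row.trade_date, row.trade_count)
```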

Posted 4 days ago

Apply

5.0 - 8.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Job Title: Senior PySpark Data Engineer (please share only quality profiles)
Location: Pune/Hybrid
Experience: 5-8 years
Budget: 8-11 LPA
Notice Period: Immediate to 15 days

Mandatory Skills:
· Python
· SQL
· ETL
· Informatica PowerCenter
· AWS/Azure

Good to Have:
· IDMC

Tech Stack Table (for each skill, state experience and a rating out of 10): Python, SQL, ETL, Informatica PowerCenter, AWS/Azure.

Job Summary: We are seeking a Senior PySpark Data Engineer with extensive experience in developing, optimizing, and maintaining data processing jobs using PySpark. The ideal candidate will possess a robust background in SQL and ETL processes, along with proficiency in cloud platforms such as AWS or Azure. This role will require excellent analytical skills and the ability to communicate effectively with both technical and non-technical stakeholders.

Key Responsibilities:
· Design, develop, and optimize PySpark jobs for enhanced performance and scalability.
· Collaborate with data architects and business analysts to understand data requirements and translate them into technical specifications.
· Redesign and maintain complex SQL queries and stored procedures to support data extraction and transformation processes.
· Utilize ETL tools, specifically Informatica PowerCenter, to build effective data pipelines.
· Troubleshoot and resolve data quality issues and performance bottlenecks.
· Mentor and provide technical guidance to a team of developers to enhance productivity and code quality.
· Stay updated with new technologies and practices to continually improve data processing capabilities.

Qualifications:
· Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
· Experience: 5-8 years of experience in data engineering, with a strong focus on PySpark and ETL processes.

Technical Skills:

Must-Have:
1. Extensive experience with PySpark, focusing on job optimization techniques.
2. Proficiency in SQL, with experience in SQL Server, MySQL, or other relational databases.
3. Strong knowledge of ETL concepts and tools, particularly Informatica PowerCenter and IDMC.
4. Excellent analytical and troubleshooting skills.
5. Strong communication skills for effective collaboration.

Good to Have:
1. Basic knowledge of Unix commands and shell scripting.
2. Experience in leading and mentoring development teams.
3. Familiarity with Azure/Fabric.

Kindly share profiles only in the tracker format below, with the tracker attached to the body of the mail. Profiles without this tracker format and the Tech Stack Table will not be considered.

Tracker fields: S.No, Date, Position, Name of the Candidate, Mobile Number, Email ID, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period / On Paper, Current Organisation, Current Location, Address with PIN Code, Reason for Leaving, DOB, Offer in Hand, Vendor Name.

Regards
Damodar
91-8976334593
info@d-techworks.com
D-TechWorks Pvt Ltd
USA | INDIA
www.d-techworks.com
Information Technology Services
Technology | Consulting | Development | Staff Augmentation
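As a flavor of the PySpark job-optimization techniques this posting asks about, here is a minimal sketch showing a broadcast join and explicit output partitioning. The storage paths and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast, col

spark = SparkSession.builder.appName("optimized-join").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/orders/")        # hypothetical large table
countries = spark.read.parquet("s3://example-bucket/countries/")  # small dimension table

# Broadcasting the small side avoids shuffling the large table across the cluster.
enriched = orders.join(broadcast(countries), on="country_code", how="left")

# Repartition by the write key so output files align with downstream reads.
result = enriched.filter(col("amount") > 0).repartition("country_code")

result.write.mode("overwrite").partitionBy("country_code").parquet(
    "s3://example-bucket/orders_enriched/"
)
```

Other common techniques in the same spirit include caching only reused DataFrames, tuning shuffle partition counts, and avoiding wide transformations on skewed keys.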

Posted 4 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  1. Junior ETL Developer
  2. ETL Developer
  3. Senior ETL Developer
  4. ETL Tech Lead
  5. ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium; a short code sketch follows this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
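
To ground the incremental-load question flagged above, here is a minimal, hypothetical PySpark sketch of a watermark-based incremental load: only rows newer than the last successfully loaded timestamp are pulled from the source. The table names and the watermark control table are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# Read the high-water mark from the last successful run (hypothetical control table).
last_loaded = (spark.table("control.watermarks")
               .filter(col("table_name") == "orders")
               .first()["last_loaded_at"])

# Extract only rows that arrived after the watermark.
delta = spark.table("source.orders").filter(col("updated_at") > last_loaded)

# Append the delta to the target; a production job would also update the
# watermark atomically after a successful write, and handle late-arriving data.
delta.write.mode("append").saveAsTable("warehouse.orders")
```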

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
