
10314 ETL Jobs - Page 35

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 years

0 Lacs

Gurugram, Haryana

Remote

Indeed logo

Position: GCP Data Engineer
Company Info: Prama (HQ: Chandler, AZ, USA)
Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries and help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in the Phoenix area with offices in the USA, Canada, Mexico, Brazil, and India.

Location: Bengaluru | Gurugram | Hybrid
Benefits: 5-Day Working | Career Growth | Flexible Working | Potential On-site Opportunity
Kindly send your CV or resume to careers@prama.ai
Primary skills: GCP, PySpark, Python, SQL, ETL

Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.

Responsibilities:
· Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
· Implement ETL processes to extract, transform, and load data from various sources into BigQuery (a minimal sketch follows this listing).
· Optimize data pipelines for performance, cost-efficiency, and reliability.
· Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
· Design and implement data warehouses and data marts using BigQuery.
· Model and structure data for optimal performance and query efficiency.
· Develop and maintain data quality checks and monitoring processes.
· Use SQL and Python (PySpark) to analyze large datasets and generate insights.
· Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
· Manage and maintain GCP resources, including virtual machines, storage, and networking.
· Implement best practices for security, cost optimization, and scalability.
· Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
· Strong proficiency in SQL, Python, and PySpark.
· Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
· Experience with data warehousing concepts and methodologies.
· Understanding of data modeling techniques and best practices.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration skills.
· Experience with data quality assurance and monitoring.
· Knowledge of cloud security best practices.
· A passion for data and a desire to learn new technologies.

Preferred Qualifications:
· Google Cloud Platform certification.
· Experience with machine learning and AI.
· Knowledge of data streaming technologies (Kafka, Pub/Sub).
· Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits:
Flexible schedule
Health insurance
Leave encashment
Paid sick time
Provident Fund
Work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
Current CTC
Expected CTC
Notice Period (days)
Experience in GCP
Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
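A minimal sketch of the kind of GCS-to-BigQuery ETL this posting describes, using PySpark and the spark-bigquery connector. The bucket, dataset, and table names are hypothetical, and the connector jar plus a staging bucket are assumed to be configured.

```python
# Minimal PySpark ETL sketch: extract CSV from Cloud Storage, transform,
# load into BigQuery. All resource names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("gcs-to-bigquery-etl").getOrCreate()

# Extract: read raw order data from a GCS bucket
orders = spark.read.option("header", True).csv("gs://example-bucket/raw/orders/")

# Transform: type casting, basic cleansing, and a derived column
clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write to BigQuery via the spark-bigquery connector
# (requires the connector jar and a temporary GCS bucket for staging)
(clean.write.format("bigquery")
    .option("table", "example_project.analytics.orders_clean")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("append")
    .save())
```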

Posted 3 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

LinkedIn logo

TEKsystems is seeking a Senior AWS + Data Engineer to join our dynamic team. The ideal candidate should have data engineering expertise with Hadoop, Scala/Python, and AWS services. This role involves designing, developing, and maintaining scalable and reliable software solutions.

Job Title: Data Engineer – Spark/Scala (Batch Processing)
Location: Manyata (Hybrid)
Experience: 7+ years
Type: Full-Time

Mandatory Skills:
7-10 years' experience in design, architecture, or development in Analytics and Data Warehousing.
Experience in building end-to-end solutions on a Big Data platform with Spark or Scala programming (a minimal batch sketch follows this listing).
5 years of solid experience in ETL pipeline building with a Spark or Scala programming framework, with knowledge of developing UNIX shell scripts and Oracle SQL/PL-SQL.
Experience with Big Data platforms for ETL development on the AWS cloud.
Proficiency in AWS cloud services, specifically EC2, S3, Lambda, Athena, Kinesis, Redshift, Glue, EMR, DynamoDB, IAM, Secrets Manager, Step Functions, SQS, SNS, and CloudWatch.
Excellent skills in Python-based framework development are mandatory.
Should have experience with Oracle SQL database programming, SQL performance tuning, and relational model analysis.
Extensive experience with Teradata data warehouses and Cloudera Hadoop.
Proficient across Enterprise Analytics/BI/DW/ETL technologies such as Teradata Control Framework, Tableau, OBIEE, SAS, Apache Spark, and Hive.
Analytics & BI architecture appreciation and broad experience across all technology disciplines.
Experience working within a Data Delivery Life Cycle framework and Agile methodology.
Extensive experience in large enterprise environments handling large volumes of data with high SLAs.
Good knowledge of developing UNIX scripts, Oracle SQL/PL-SQL, and Autosys JIL scripts.
Well versed in AI-powered engineering tools like Cline and GitHub Copilot.

Please send resumes to nvaseemuddin@teksystems.com or kebhat@teksystems.com
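A minimal PySpark batch sketch in the spirit of the role above: read a day's batch from S3, transform, and write partitioned Parquet back to S3. Bucket names, the event schema, and the date are hypothetical; reading s3a:// paths assumes the hadoop-aws dependency is on the classpath.

```python
# Batch ETL sketch: S3 JSON in, deduplicated daily aggregates out as Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-batch-etl").getOrCreate()

# Read a day's batch of JSON events from S3 (s3a:// requires hadoop-aws)
events = spark.read.json("s3a://example-raw-bucket/events/dt=2025-06-16/")

# Deduplicate on event_id and aggregate per user
daily = (
    events.dropDuplicates(["event_id"])
    .groupBy("user_id")
    .agg(F.count("*").alias("event_count"),
         F.max("event_ts").alias("last_event_ts"))
)

# Write results partitioned by load date for downstream Athena/Redshift use
(daily.withColumn("dt", F.lit("2025-06-16"))
    .write.mode("overwrite")
    .partitionBy("dt")
    .parquet("s3a://example-curated-bucket/daily_user_activity/"))
```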

Posted 3 days ago

Apply

0.0 - 6.0 years

0 Lacs

Delhi, Delhi

Remote

Indeed logo

Apache Superset Data Engineer
Experience: 3 - 6 years
Bhubaneswar, Delhi - NCR, Remote Working

About the Job
The Apache Superset Data Engineer plays a key role in designing, developing, and maintaining scalable data pipelines and analytics infrastructure, with a primary emphasis on data visualization and dashboarding using Apache Superset. This role sits at the intersection of data engineering and business intelligence, enabling stakeholders to access accurate, actionable insights through intuitive dashboards and reports.

Core Responsibilities
Create, customize, and maintain interactive dashboards in Apache Superset to support KPIs, experimentation, and business insights
Work closely with analysts, BI teams, and business users to gather requirements and deliver effective Superset-based visualizations
Perform data validation, feature engineering, and exploratory data analysis to ensure data accuracy and integrity
Analyze A/B test results and deliver insights that inform business strategies
Establish and maintain standards for statistical testing, data validation, and analytical workflows
Integrate Superset with various database systems (e.g., MySQL, PostgreSQL) and manage associated drivers and connections (see the connection sketch after this listing)
Ensure Superset deployments are secure, scalable, and high-performing
Clearly communicate findings and recommendations to both technical and non-technical stakeholders

Required Skills
Proven expertise in building dashboards and visualizations using Apache Superset
Strong command of SQL and experience working with relational databases like MySQL or PostgreSQL
Proficiency in Python (or Java) for data manipulation and workflow automation
Solid understanding of data modelling, ETL/ELT pipelines, and data warehousing principles
Excellent problem-solving skills and a keen eye for data quality and detail
Strong communication skills, with the ability to simplify complex technical concepts for non-technical audiences
Nice to have: familiarity with cloud platforms (AWS, ECS)

Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field
3+ years of relevant experience
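Superset reaches databases through SQLAlchemy URIs. A small sketch that validates the kind of URI you would register under Superset's database-connection flow before wiring a dashboard to it; the host, credentials, and table are hypothetical.

```python
# Sanity-check a SQLAlchemy URI of the form Superset accepts for PostgreSQL.
# Requires the sqlalchemy and psycopg2 packages.
from sqlalchemy import create_engine, text

# Hypothetical read-only connection a Superset dataset would use
uri = "postgresql+psycopg2://superset_ro:secret@db.example.com:5432/analytics"

engine = create_engine(uri)
with engine.connect() as conn:
    # Verify the connection and the table a dashboard will sit on
    row_count = conn.execute(
        text("SELECT COUNT(*) FROM public.orders")
    ).scalar()
    print(f"orders rows visible to Superset: {row_count}")
```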

Posted 3 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Indeed logo

At Cotality, we are driven by a single mission—to make the property industry faster, smarter, and more people-centric. Cotality is the trusted source for property intelligence, with unmatched precision, depth, breadth, and insights across the entire ecosystem. Our talented team of 5,000 employees globally uses our network, scale, connectivity, and technology to drive the largest asset class in the world. Join us as we work toward our vision of fueling a thriving global property ecosystem and a more resilient society. Cotality is committed to cultivating a diverse and inclusive work culture that inspires innovation and bold thinking; it's a place where you can collaborate, feel valued, develop skills, and directly impact the real estate economy. We know our people are our greatest asset. At Cotality, you can be yourself, lift people up and make an impact. By putting clients first and continuously innovating, we're working together to set the pace for unlocking new possibilities that better serve the property industry.

Job Description:
In India, we operate as Next Gear India Private Limited, a fully-owned subsidiary of Cotality with offices in Kolkata, West Bengal, and Noida, Uttar Pradesh. Next Gear India Private Limited plays a vital role in Cotality's product development capabilities, focusing on creating and delivering innovative solutions for the Property & Casualty (P&C) Insurance and Property Restoration industries. While Next Gear India Private Limited operates under its own registered name in India, we are seamlessly integrated into the Cotality family, sharing the same commitment to innovation, quality, and client success. When you join Next Gear India Private Limited, you become part of the global Cotality team. Together, we shape the future of property insights and analytics, contributing to a smarter and more resilient property ecosystem through cutting-edge technology and insights.

Description:
We are seeking a highly skilled Lead Data Analyst to join our Analytics team to serve customers across the property insurance and restoration industries. As a Lead Data Analyst, you will play a crucial role in developing methods and models to inform data-driven decision processes, resulting in improved business performance for both internal and external stakeholder groups. You will be responsible for interpreting complex data sets and providing valuable insights to enhance the value of data assets.
The successful candidate will have a strong understanding of data mining techniques, methods of statistical analysis, and data visualization tools. This position offers an exciting opportunity to work in a dynamic environment, collaborating with cross-functional teams to support decision processes that will guide the respective industries into the future.

Responsibilities:
Collaborate with cross-functional teams to understand and document requirements for analytics products.
Serve as the primary point of contact for new data/analytics requests and support for customers.
Lead a team of analysts to deliver client deliverables in a timely manner.
Act as the domain expert and voice of the customer to internal stakeholders during the analytics development process.
Develop and maintain an inventory of data, reporting, and analytic product deliverables for assigned customers.
Work with customer success teams to establish and maintain appropriate customer expectations for analytics deliverables.
Create and manage tickets on behalf of customers within internal frameworks.
Ensure timely delivery of assets to customers and aid in the development of internal processes for the delivery of analytics deliverables.
Work with IT/Infrastructure teams to provide customer access to assets and support internal audit processes to ensure data security.
Create and optimize complex SQL queries for data extraction, transformation, and aggregation (see the aggregation sketch after this listing).
Develop and maintain data models, dashboards, and reports to visualize data and track key performance metrics.
Conduct validation checks and implement error handling mechanisms to ensure data reliability.
Collaborate closely with stakeholders to align project goals with business needs and perform ad-hoc analysis to provide actionable recommendations.
Analyze large and complex datasets to identify trends, patterns, and insights, and present findings and recommendations to stakeholders in a clear and concise manner.

Job Qualifications:
7+ years' property insurance experience preferred.
5+ years' experience in management of mid-level professional teams, or a similar leadership position with a focus on data and/or performance management.
Extensive experience in applying and/or developing performance management metrics for claims organizations.
Bachelor's degree in computer science, data science, statistics, or a related field is preferred.
Mastery-level knowledge of data analysis tools such as Excel, Tableau, or Power BI.
Demonstrated expertise in creating reports and dashboards in Power BI, including the ability to connect to various data sources, prepare and model data, and create visualizations.
Expert knowledge of DAX for creating calculated columns and measures to meet report-specific requirements.
Expert knowledge of Power Query for importing, transforming, and shaping data.
Proficiency in SQL with the ability to write complex queries and optimize performance.
Strong knowledge of ETL processes, data pipelines, and automation is a plus.
Proficiency in managing tasks with Jira is an advantage.
Strong analytical and problem-solving skills.
Excellent attention to detail and the ability to work with large datasets.
Effective communication skills, both written and verbal.
Excellent visual communication and storytelling-with-data skills.
Ability to work independently and collaborate in a team environment.
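An illustrative pandas sketch of the kind of claims performance aggregation this role describes; the posting itself centers on SQL and Power BI/DAX, so this is a language-neutral stand-in. The claims table and its columns are hypothetical.

```python
# Hypothetical claims KPI rollup: volume, cycle time, and severity by region.
import pandas as pd

claims = pd.DataFrame({
    "claim_id": [1, 2, 3, 4],
    "region": ["North", "North", "South", "South"],
    "opened": pd.to_datetime(["2025-01-02", "2025-01-05", "2025-01-03", "2025-01-10"]),
    "closed": pd.to_datetime(["2025-01-12", "2025-01-25", "2025-01-08", "2025-02-01"]),
    "paid_amount": [1200.0, 8400.0, 560.0, 3100.0],
})

# Cycle time: days from claim open to claim close
claims["cycle_days"] = (claims["closed"] - claims["opened"]).dt.days

# Performance metrics per region
kpis = claims.groupby("region").agg(
    claim_count=("claim_id", "count"),
    avg_cycle_days=("cycle_days", "mean"),
    avg_severity=("paid_amount", "mean"),
)
print(kpis)
```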
Cotality's Diversity Commitment:
Cotality is fully committed to employing a diverse workforce and creating an inclusive work environment that embraces everyone's unique contributions, experiences, and values. We offer an empowered work environment that encourages creativity, initiative, and professional growth and provides a competitive salary and benefits package. We are better together when we support and recognize our differences.

Equal Opportunity Employer Statement:
Cotality is an Equal Opportunity employer committed to attracting and retaining the best-qualified people available, without regard to race, ancestry, place of origin, colour, ethnic origin, citizenship, creed, sex, sexual orientation, record of offences, age, marital status, family status or disability. Cotality maintains a Drug-Free Workplace. Please apply on our website for consideration.

Privacy Policy: Global Applicant Privacy Policy

By providing your telephone number, you agree to receive automated (SMS) text messages at that number from Cotality regarding all matters related to your application and, if you are hired, your employment and company business. Message & data rates may apply. You can opt out at any time by responding STOP or UNSUBSCRIBE and will automatically be opted out company-wide.

Posted 3 days ago

Apply

0.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Indeed logo

Noida, Uttar Pradesh, India | Business Intelligence

BOLD is seeking a QA professional who will work directly with the BI development team to validate Business Intelligence solutions. They will build the test strategy, test plans, and test cases for ETL and Business Intelligence components, validate SQL queries related to test cases, and produce test summary reports.

Job Description

ABOUT THIS TEAM
BOLD's Business Intelligence (BI) team is a centralized team responsible for managing all aspects of the organization's BI strategy, projects, and systems. The BI team enables business leaders to make data-driven decisions by providing reports and analysis. The team is responsible for developing and managing a latency-free, credible enterprise data warehouse, which serves as a data source for decision making and an input to various functions of the organization like Product, Finance, Marketing, and Customer Support. The BI team has four sub-components: data analysis, ETL, data visualization, and QA. It manages deliveries through Snowflake, Sisense, and MicroStrategy as its main infrastructure solutions. Other technologies, including Python, R, and Airflow, are also used in ETL, QA, and data visualization.

WHAT YOU'LL DO
Work with business analysts and BI developers to translate business requirements into test cases
Validate data sources, extraction of data, application of transformation logic, and loading of data into target tables (see the reconciliation sketch after this listing)
Design, document, and execute test plans, test harnesses, test scenarios/scripts, and test cases for manual and automated testing, using bug tracking tools

WHAT YOU'LL NEED
Experience in Data Warehousing / BI testing using any ETL and reporting tool
Extensive experience in writing and troubleshooting SQL queries on any of the databases – Snowflake / Redshift / SQL Server / Oracle
Exposure to data warehousing and dimensional modelling concepts
Experience in understanding ETL source-to-target mapping documents
Experience in testing code on any of the ETL tools
Experience validating dashboards/reports on any of the reporting tools – Sisense / Tableau / SAP BusinessObjects / MicroStrategy
Hands-on experience and strong understanding of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC)
Good experience with quality assurance methodologies like Waterfall, V-Model, Agile, and Scrum
Well versed in writing detailed test cases for functional and non-functional requirements
Experience with different types of testing, including black box testing, smoke testing, functional testing, system integration testing, end-to-end testing, regression testing, and user acceptance testing (UAT), as well as load, performance, and stress testing
· Expertise in using TFS / JIRA / Excel for writing and tracking test cases
· Exposure to scripting languages like Python for creating automated test scripts, or to automated tools like QuerySurge, will be an added advantage
· An effective communicator with strong analytical abilities, combined with skills to plan, implement, and present projects

EXPERIENCE: Senior QA Engineer, BI: 4.5+ years

Benefits

Outstanding Compensation
Competitive salary
Tax-friendly compensation structure
Bi-annual bonus
Annual appraisal
Equity in company

100% Full Health Benefits
Group Mediclaim, personal accident, and term life insurance
Group Mediclaim benefit (including parents' coverage)
Practo Plus health membership for employees and family
Personal accident and term life insurance coverage

Flexible Time Away
24 days paid leave
Declared fixed holidays
Paternity and maternity leave
Compassionate and marriage leave
Covid leave (up to 7 days)

Additional Benefits
Internet and home office reimbursement
In-office catered lunch, meals, and snacks
Certification policy
Cab pick-up and drop-off facility

About BOLD
We Transform Work Lives
As an established global organization, BOLD helps people find jobs. Our story is one of growth, success, and professional fulfillment. We create digital products that have empowered millions of people in 180 countries to build stronger resumes, cover letters, and CVs. The result of our work helps people interview confidently, finding the right job in less time. Our employees are experts, learners, contributors, and creatives.

We Celebrate and Promote Diversity and Inclusion
We value our position as an Equal Opportunity Employer. We hire based on qualifications, merit, and our business needs. We don't discriminate regarding race, color, religion, gender, pregnancy, national origin or citizenship, ancestry, age, physical or mental disability, veteran status, sexual orientation, gender identity or expression, marital status, genetic information, or any other applicable characteristic protected by law.
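A sketch of the kind of source-to-target reconciliation check this QA role runs: compare row counts and column totals between a source table and the warehouse target. The connection strings, account, and table names are hypothetical; the Snowflake URI assumes the snowflake-sqlalchemy package.

```python
# ETL validation sketch: reconcile a source OLTP table with its warehouse
# target. All endpoints and table names are hypothetical.
from sqlalchemy import create_engine, text

source = create_engine("postgresql+psycopg2://qa:secret@oltp.example.com/sales")
target = create_engine("snowflake://qa:secret@example-account/EDW/PUBLIC")

def scalar(engine, sql):
    """Run a single-value query and return the result."""
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()

# Row-count parity between source and target
src_rows = scalar(source, "SELECT COUNT(*) FROM orders")
tgt_rows = scalar(target, "SELECT COUNT(*) FROM fact_orders")
assert src_rows == tgt_rows, f"Row count mismatch: {src_rows} vs {tgt_rows}"

# Measure parity: totals should survive the transformation untouched
src_total = scalar(source, "SELECT SUM(amount) FROM orders")
tgt_total = scalar(target, "SELECT SUM(amount) FROM fact_orders")
assert src_total == tgt_total, f"Amount mismatch: {src_total} vs {tgt_total}"

print("Source-to-target reconciliation passed")
```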

Posted 3 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana

On-site

Indeed logo

Engineer III, Database Engineering
Gurgaon, India; Hyderabad, India | Information Technology | 316332

Job Description

About The Role:
Grade Level (for internal use): 10

Role: As a Senior Database Engineer, you will work on multiple datasets that enable S&P Capital IQ Pro to serve up value-added ratings, research, and related information to institutional clients.

The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL, GG, SQL replication, Informatica, data pipelines) and converting it to a common format that can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle, SQL, .NET, Informatica, Kafka, and Sonic (a minimal Kafka producer sketch follows this listing). You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
Developing innovative functionality in existing and new products
Supporting and maintaining high-revenue productionized products
Achieving the above intelligently and economically using best practices

Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and to collaborate with developers, business analysts, and product managers who are experts in their domain.

Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
Complete SDLC: architecture, design, development, and support of tech solutions
Play a key role in the development team to build high-quality, high-performance, scalable code
Engineer components and common services based on standard corporate development models, languages, and tools
Produce technical design documents and conduct technical walkthroughs
Collaborate effectively with technical and non-technical stakeholders
Be part of a culture that continuously improves the technical design and code base
Document and demonstrate solutions using technical design docs, diagrams, and stubbed code

Our Hiring Manager says: "I'm looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company."

Qualifications Required:
Bachelor's degree in Computer Science, Information Systems, or Engineering.
7+ years of experience with transactional databases like SQL Server, Oracle, and PostgreSQL, and with NoSQL databases like Amazon DynamoDB and MongoDB
Strong database development skills on SQL Server and Oracle
Strong knowledge of database architecture, data modeling, and data warehousing
Knowledge of object-oriented design and design patterns
Familiarity with various design and architectural patterns
Strong development experience with Microsoft SQL Server
Experience in cloud-native development and AWS is a big plus
Experience with Kafka/Sonic broker messaging systems

Nice to have:
Experience in developing data pipelines using Java or C# is a significant advantage
Strong knowledge of ETL tools – Informatica, SSIS; exposure to Informatica is an advantage
Familiarity with Agile and Scrum models
Working knowledge of VSTS
Working knowledge of the AWS cloud is an added advantage
Understanding of fundamental design principles for building a scalable system
Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, and Index/Benchmarks is desirable
Additionally, experience with Scala, Python, and Spark applications is a plus

About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all: from finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.

If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)

Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India
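The pipeline stack above includes Kafka/Sonic messaging. A minimal kafka-python producer sketch of the kind of change event such a pipeline might publish; the broker address, topic, and payload are hypothetical.

```python
# Minimal Kafka producer sketch using the kafka-python package.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["broker.example.com:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a ratings-data change event for downstream consumers
event = {"entity_id": "XYZ123", "rating": "AA-", "source": "etl_pipeline"}
producer.send("ratings-updates", value=event)
producer.flush()  # block until the event is actually delivered
```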

Posted 3 days ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Indeed logo

Bangalore, Karnataka, India | Job ID 763290

Join our Team

About this opportunity:
Ericsson's Automation Chapter is seeking highly motivated and self-driven Data Engineers and Senior Data Engineers with strong expertise in SAP HANA and SAP BODS. The ideal candidates will be focused on SAP-centric development and integration, ensuring that enterprise data flows are robust, scalable, and optimized for analytics consumption. You will collaborate with a high-performing team that builds and supports end-to-end data solutions aligned with our SAP ecosystem. You are an adaptable and flexible problem-solver with deep hands-on experience in HANA modeling and ETL workflows, capable of switching contexts across a range of projects with varying scale and complexity.

What you bring:
Design, develop, and optimize SAP HANA objects such as calculation views, SQL procedures, and custom functions (a small query sketch against such a view follows this listing).
Develop robust and reusable ETL pipelines using SAP BODS for both SAP and third-party system integration.
Enable seamless data flow between SAP ECC and external platforms, ensuring accuracy and performance.
Collaborate with business analysts, architects, and integration specialists to translate requirements into technical deliverables.
Tune and troubleshoot HANA and BODS jobs for performance, scalability, and maintainability.
Ensure compliance with enterprise data governance, lineage, and documentation standards.
Support ongoing enhancements, production issues, and business-critical data deliveries.

Experience:
8+ years of experience in SAP data engineering roles.
Strong hands-on experience in SAP HANA (native development, modeling, SQL scripting).
Proficient in SAP BODS, including job development, data flows, and integration techniques.
Experience working with SAP ECC data structures, IDocs, and remote function calls.
Knowledge of data warehouse concepts, data modeling, and performance optimization techniques.
Strong debugging and analytical skills, with the ability to independently drive technical solutions.
Familiarity with version control tools and SDLC processes.
Excellent communication skills and ability to work collaboratively in cross-functional teams.

Education:
Bachelor's degree in Computer Science, Information Systems, Electronics & Communication, or a related field.

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible, and to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
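A minimal sketch using SAP's hdbcli Python driver to consume the kind of HANA artifact this role builds (a calculation view exposed in a schema), the way a BODS job or reporting layer would. The host, credentials, and view name are hypothetical.

```python
# Query a HANA calculation view via the hdbcli DB-API driver.
from hdbcli import dbapi

conn = dbapi.connect(
    address="hana.example.com",  # hypothetical HANA host
    port=30015,
    user="ETL_USER",
    password="secret",
)

cursor = conn.cursor()
# Calculation views built in HANA XS classic are exposed under _SYS_BIC
cursor.execute(
    'SELECT REGION, SUM(NET_SALES) FROM "_SYS_BIC"."sales/CV_NET_SALES" '
    "GROUP BY REGION"
)
for region, net_sales in cursor.fetchall():
    print(region, net_sales)

cursor.close()
conn.close()
```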

Posted 3 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across the chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What role you will play in our team
The Foundry Application Developer will leverage their expertise in both low-code and pro-code application development, with a strong emphasis on data engineering and ETL ecosystems, particularly within Palantir Foundry. This role is pivotal in developing and implementing use cases that utilize the Foundry Ontology (data layer) to drive business insights and operational efficiencies. The job location is Bengaluru, Karnataka.

What you will do
Design, develop, and maintain robust data pipelines and ETL processes using Palantir Foundry to ensure seamless data integration and transformation (a minimal transform sketch follows this listing).
Work closely with Ontology developers to create and enhance Ontology objects that support various use case applications, ensuring data consistency and accessibility.
Develop use case applications primarily using Foundry Workshop (low-code environment) and, when necessary, employ the Foundry SDK (pro-code) to meet complex requirements.
Continuously monitor and optimize data workflows to improve the performance, scalability, and reliability of data processes.
Adhere to and promote best practices in data engineering, including code quality, version control, and documentation.

About You
Skills and Qualifications
Bachelor's or master's degree from a recognized university in Computer/IT or another relevant engineering discipline, with a minimum GPA of 7.0
Minimum 5 years of overall IT experience working with data
Minimum 3 years on Palantir Foundry, with strong experience on the Palantir Foundry platform, SQL, PySpark, and data warehousing
Experience working with modern web development languages (e.g., JavaScript, TypeScript, React)
Experience in Ontology design
Implementation experience building data pipelines in Palantir Foundry and automating ETL processes
Experience building applications using the Foundry application-development tool stack
Well versed with the migration and deployment process on the Palantir platform
Experience implementing interactive dashboards and visualizations in Palantir Foundry
Knowledge of Git version control best practices
Understanding of databases, data warehouses, and data modeling
Excellent teamwork and communication skills, with the ability to work effectively in a collaborative environment

Preferred Qualifications/Experience
Strong analytical and problem-solving skills with an ability to learn quickly and continuously
Demonstrated ability to analyze complex data problems and develop innovative solutions
Ability to adapt to new technologies and methodologies in a rapidly evolving space
Experience with Agile practices and working in a SCRUM team
Any prior working experience in the Oil & Gas sector

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
Competitive compensation
Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
Retirement benefits
Global networking & cross-functional opportunities
Annual vacations & holidays
Day care assistance program
Training and development program
Tuition assistance program
Workplace flexibility policy
Relocation program
Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India: visit ExxonMobil India and Energy Factor India.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups. Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others.
For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.
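A minimal sketch of the kind of Palantir Foundry Python transform the role above describes: a pipeline step that reads one Foundry dataset and writes a cleaned one, runnable only inside a Foundry code repository. The dataset paths are hypothetical.

```python
# Foundry transform sketch using the transforms.api decorators.
from pyspark.sql import functions as F
from transforms.api import transform_df, Input, Output

@transform_df(
    Output("/Example/ontology/clean_work_orders"),  # hypothetical output path
    raw=Input("/Example/raw/work_orders"),          # hypothetical input path
)
def clean_work_orders(raw):
    # Standardize types and drop records without an asset reference,
    # ready to back an Ontology object type downstream
    return (
        raw.withColumn("opened_at", F.to_timestamp("opened_at"))
           .filter(F.col("asset_id").isNotNull())
    )
```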

Posted 3 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

LinkedIn logo

Location: Hyderabad
Contract Duration: 6 Months
Experience Required: 8+ years (overall), 5+ years (relevant)

🔧 Primary Skills
Python
Spark (PySpark)
SQL
Delta Lake

📌 Key Responsibilities & Skills
Strong understanding of Spark core: RDDs, DataFrames, Datasets, Spark SQL, Spark Streaming
Proficient in Delta Lake features: time travel, schema evolution, data partitioning (see the sketch after this listing)
Experience designing and building data pipelines using Spark and Delta Lake
Solid experience in Python/Scala/Java for Spark development
Knowledge of data ingestion from files, APIs, and databases
Familiarity with data validation and quality best practices
Working knowledge of data warehouse concepts and data modeling
Hands-on with Git for code versioning
Exposure to CI/CD pipelines and containerization tools
Nice to have: experience with ETL tools like DataStage, Prophecy, Informatica, or Ab Initio
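A sketch of the two Delta Lake features the posting calls out: schema evolution on write and time travel on read. The table path is hypothetical, and the snippet assumes a Spark session already configured for Delta (for example, via the delta-spark package or on Databricks).

```python
# Delta Lake sketch: append with schema evolution, then time-travel a read.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-demo").getOrCreate()
path = "/data/delta/events"  # hypothetical existing Delta table

# Append a batch whose schema has gained a column (schema evolution)
new_batch = spark.createDataFrame(
    [("e1", "click", "mobile")], ["event_id", "action", "device"]
)
(new_batch.write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")  # allow the new 'device' column
    .save(path))

# Time travel: read the table as it looked at an earlier version
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
current = spark.read.format("delta").load(path)
print(v0.count(), current.count())
```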

Posted 3 days ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

LinkedIn logo

Description
Amazon's Product Lifecycle Support (PLS) offers relevant post-purchase product support to customers and empowers them to make the most of the products purchased on Amazon. By solving post-purchase product issues, we prevent avoidable returns and help the planet by extending the life of products, thereby generating positive financial and environmental impacts.

PLS first launched in Q4 2015, featuring a warranty repair option for Samsung laptops surfaced during the returns process in the Online Return Center (ORC). Since then, PLS has grown substantially and now offers customers eight product support options: (1) live call and chat with an Amazon product support agent, available up to 6 months; (2) live call and chat with the brand's product support agent, available up to 2 years; (3) contacting the manufacturer yourself via the brand's phone number and/or support website, available up to 2 years; (4) free warranty repair services, available up to 2 years; (5) free replacement parts, available until the return window closes; (6) other sustainable end-of-life options such as trade-in, resell, refill, recycle, and donate, available up to 2 years; (7) self-help via step-by-step video instructions provided by the brand; and (8) self-help via step-by-step written instructions provided by the brand.

We are seeking a motivated Account Manager to drive brand adoption and expansion of the PLS program across multiple marketplaces in North America and Europe. This role will play a crucial part in helping brands enroll and optimize their product support offerings—spanning setup, troubleshooting, warranty services, replacement parts, trade-ins, and recycling solutions—to improve the customer experience, reduce returns, and enhance product sustainability. The Account Manager will work closely with the Customer Insights Program (CIP) lead to prioritize target brands and products. With guidance from their local manager, they will partner with Selling Partner Program (SIP) and Enrollment & Operations Program (EOP) counterparts, and collaborate with internal stakeholder teams (AMs, VMs, CSMs, sales reps, and marketing teams) to execute outreach campaigns, drive enrollment, and increase the coverage and quality of PLS.

Key job responsibilities

Brand Prioritization & Targeting
Leverage insights from the CIP program to identify high-priority brands and products for PLS expansion.
Analyze return trends, defect drivers, and customer engagement metrics to develop targeted outreach plans.

Brand Engagement & Awareness
Partner with the SIP program lead to execute multi-channel brand engagement strategies, including email campaigns, webinars, training sessions, and one-on-one consultations.
Educate brands on how to enroll and manage product support on Seller Central (SC) and Vendor Central (VC) using self-serve tools.
Highlight the benefits of Amazon's AI assistant (Rufus), the Get Product Support (GPS) button, and performance reporting in enhancing product support.
Work closely with brands to co-develop sustainability solutions for extended product support beyond two years.

Stakeholder Collaboration
Build strong relationships with AMs, VMs, CSMs, and sales teams to align PLS messaging across all touchpoints.
Collaborate with internal teams to ensure smooth onboarding and support for brands needing deeper integration.

Performance Tracking & Reporting
Monitor PLS enrollment trends and adoption rates across brands and marketplaces.
Gather and synthesize Voice of Seller (VOS) insights to inform feature enhancements and drive continuous improvement.
Provide regular reporting on brand engagement impact and adoption rates.

Basic Qualifications
1+ years of program or project management experience
Knowledge of SQL and advanced Excel (array and statistical formulas)
Experience using data to influence business decisions

Preferred Qualifications
Knowledge of analytics and statistical tools such as SAS, Power BI, and SQL, and of ETL/DW concepts
Knowledge of visualization tools such as Tableau, Datazen, and SSRS
Experience in back-office operations, escalation management, and troubleshooting environments

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: Amazon Dev Center India - Hyderabad - A85
Job ID: A2994020

Posted 3 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

LinkedIn logo

ABOUT US:
Bain & Company is a global consultancy that helps the world's most ambitious change makers define the future. Across 61 offices in 39 countries, we work alongside our clients as one team with a shared ambition to achieve extraordinary results, outperform the competition and redefine industries. Since our founding in 1973, we have measured our success by the success of our clients, and we proudly maintain the highest level of client advocacy in the industry. In 2004, the firm established its presence in the Indian market by opening the Bain Capability Center (BCC) in New Delhi. The BCC is now known as BCN (Bain Capability Network), with its nodes across various geographies. BCN plays a critical role in supporting Bain's case teams globally with analytics and research across all industries, for corporate cases, client development, private equity diligence, and Bain intellectual property. The BCN comprises Consulting Services, Knowledge Services, and Shared Services.

WHO YOU'LL WORK WITH:
This role is based out of the People & ORG CoE, which sits in the broader Data & Tech cluster at the BCN. The People & ORG CoE builds and deploys analytical solutions pertaining to the Operating Model and Organization practice. The team primarily helps Bain case teams, across geographies and industries, solve critical client issues by applying battle-proven diagnostics and solutions that can identify client pain points related to org, culture, and talent. The team also plays a significant role in creating, testing, and contributing to proprietary products and Bain IP within the org domain. This role will focus on the development, maintenance, and evolution of state-of-the-art org tools and data assets.

WHAT YOU'LL DO:
Become an expert in and own, maintain, and evolve advanced internal tools (Python-focused), and help develop new tools with LLMs and GenAI, in an individual capacity
Be responsible for end-to-end handling of the entire tool process, i.e., developing Python scripts, troubleshooting errors, etc.
Help with case delivery related to those tools and generate meaningful insights for Bain clients
Potentially build and maintain internal web applications using front-end technologies (HTML, CSS, JavaScript) and frameworks like Streamlit (a minimal sketch follows this listing); ensure compatibility across devices and browsers
Work under the guidance of a Team Manager / Senior Team Manager, playing a key role in driving the team's innovation, especially on GenAI topics – identifying areas for automation and augmentation, and helping the team create efficiency gains
Lead internal team calls and effectively communicate data, knowledge, insights, and actionable next steps on tool development, relaying implications to your own internal team where necessary
Keep abreast of new and current statistical, database, machine learning, and advanced analytics techniques

ABOUT YOU:
Candidates should be graduates/postgraduates from a top-tier college with a strong academic background
Must have 4+ years of relevant experience with Python; experience using or building tools with GenAI, LLMs, or machine learning is preferred
An advanced understanding of database design and the functioning of Azure/AWS servers is preferred
Good to have experience in SQL and Git, and hands-on experience with statistical and machine learning models (e.g., regression, classification, clustering, NLP, ensemble methods), including practical application in business contexts
Good to have experience with HTML, CSS, JavaScript (ES6+), pgAdmin, and low-code development tools such as Streamlit, Mendix, and Power Apps
Experience with data science/data analytics and ETL tools such as Alteryx or Informatica will be a plus
Must be able to generate and screen realistic answers based on sound reality checks and recommend actionable solutions
Must be willing to own and maintain a high-visibility, high-impact product
Experience in managing productized solutions will be helpful
Excellent oral and written communication skills, including the ability to communicate effectively with both technical and non-technical senior stakeholders
Ability to prioritize projects, manage multiple competing priorities, and drive projects to completion under tight deadlines

WHAT MAKES US A GREAT PLACE TO WORK:
We are proud to be consistently recognized as one of the world's best places to work, a champion of diversity, and a model of social responsibility. We are currently ranked the #1 consulting firm on Glassdoor's Best Places to Work list, and we have maintained a spot in the top four on Glassdoor's list for the last 12 years. We believe that diversity, inclusion, and collaboration are key to building extraordinary teams. We hire people with exceptional talents, abilities, and potential, then create an environment where you can become the best version of yourself and thrive both professionally and personally. We are publicly recognized by external parties such as Fortune, Vault, Mogul, Working Mother, Glassdoor, and the Human Rights Campaign for being a great place to work for diversity and inclusion, women, LGBTQ and parents.
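A minimal Streamlit sketch of the kind of internal tool this role maintains: upload a CSV, inspect it, and chart a chosen metric. The app name and expected columns are hypothetical; run it with `streamlit run app.py`.

```python
# Internal-tool sketch: CSV upload plus a grouped bar chart in Streamlit.
import pandas as pd
import streamlit as st

st.title("Org Diagnostic Explorer")  # hypothetical tool name

uploaded = st.file_uploader("Upload survey extract (CSV)", type="csv")
if uploaded is not None:
    df = pd.read_csv(uploaded)
    st.write("Rows loaded:", len(df))

    # Let the user pick a numeric metric and a categorical grouping column
    metric = st.selectbox("Metric to plot", df.select_dtypes("number").columns)
    group = st.selectbox("Group by", df.select_dtypes("object").columns)

    st.bar_chart(df.groupby(group)[metric].mean())
```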

Posted 3 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

LinkedIn logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP BW on HANA Data Modeling & Development
Good-to-have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Experience with SAP ABAP on HANA
- Experience with AMDP
- Must-Have Skills: Proficiency in SAP BW on HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and dashboard creation.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW on HANA Data Modeling & Development.
- This position is based in Chennai.
- A 15-year full-time education is required.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

LinkedIn logo

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications are aligned with business objectives. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure quality and adherence to best practices.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform, Microsoft Azure Databricks, and PySpark (a minimal pipeline sketch follows this listing).
- Strong understanding of data integration techniques and ETL processes.
- Experience with cloud-based application development and deployment.
- Familiarity with agile development methodologies and practices.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- A 15-year full-time education is required.
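A minimal Azure Databricks-style PySpark sketch: ingest raw files from a lake path, transform them, and persist a Delta table. The storage paths, schema, and target table are hypothetical; on Databricks the `spark` session is provided, and the `analytics` schema is assumed to exist.

```python
# Databricks ETL sketch: CSV landing zone in, curated Delta table out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("databricks-etl").getOrCreate()

# Read raw sales extracts from a hypothetical ADLS landing container
raw = (spark.read.option("header", True)
       .csv("abfss://landing@examplelake.dfs.core.windows.net/sales/"))

# Transform: typed dates and a derived net-revenue column
curated = (
    raw.withColumn("sale_date", F.to_date("sale_date"))
       .withColumn("net", F.col("gross").cast("double") - F.col("tax").cast("double"))
)

# Persist as a managed Delta table for downstream BI consumers
(curated.write.format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.sales_curated"))
```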

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Microsoft Azure Data Services
Good to have skills: NA
Minimum 3 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving discussions and contribute to the overall success of the projects you are involved in, ensuring that the applications align with business objectives and user expectations.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in continuous learning to stay updated with the latest technologies and best practices.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Azure Data Services.
- Good To Have Skills: Experience with Azure DevOps and CI/CD pipelines.
- Strong understanding of cloud computing concepts and architecture.
- Experience in developing and deploying applications on Azure.
- Familiarity with data integration and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
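As a loose illustration of the data-integration work listed above, the sketch below lands a small CSV extract in Azure Blob Storage using the azure-storage-blob SDK. The connection string, container, and blob names are placeholders, not details from the posting.

    # Sketch: upload a transformed extract to Azure Blob Storage.
    # Connection string, container, and blob names are placeholders.
    import csv
    import io
    from azure.storage.blob import BlobServiceClient

    def upload_extract(rows, conn_str):
        """Serialize rows to CSV in memory and upload to a curated container."""
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

        service = BlobServiceClient.from_connection_string(conn_str)
        container = service.get_container_client("curated-data")
        container.upload_blob(name="daily/orders.csv", data=buf.getvalue(),
                              overwrite=True)

    if __name__ == "__main__":
        sample = [{"order_id": 1, "total": 99.5}, {"order_id": 2, "total": 42.0}]
        upload_extract(sample, "<AZURE_STORAGE_CONNECTION_STRING>")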

Posted 3 days ago

Apply

6.0 years

0 Lacs

India

Remote

Linkedin logo

Job Title: Business Analyst - Capital Markets
Location: Remote
Years of Experience: 6+ years
Mandatory Qualifications: SAFe/Agile certification

Job Description: We are looking for a highly driven Business Analyst with strong Capital Markets expertise to join our growing team. In this role, you'll play a key part in analyzing, documenting, and validating business and functional requirements, while supporting system integration and user acceptance testing. The ideal candidate has a solid background in financial services or capital markets, a sharp analytical mindset, and the ability to collaborate with stakeholders across business and technology functions.

Responsibilities:
- Engage with business stakeholders to understand, document, and validate business needs.
- Build stakeholder alignment and visualize proposed solutions through wireframes, workflows, or prototypes.
- Develop detailed business cases, requirements, user stories, test plans, and operational processes.
- Create and maintain process flows, workflows, and use case documentation using tools like Visio.
- Lead or contribute to the design and execution of test plans and test cases during SIT and UAT phases.
- Identify whether business needs can be met with existing solutions or require new design/technology.
- Collaborate closely with development and QA teams to ensure requirements are correctly implemented.
- Support training material preparation and conduct stakeholder training where needed.
- Maintain clear traceability between business requirements and delivered functionality.
- Actively support project change management, issue tracking, and continuous improvement activities.
- Prepare professional documentation and presentations for stakeholders and leadership.

Requirements:
- Bachelor's degree in Computer Science, Information Systems, Finance, or a related field.
- 6+ years of experience as a Business Analyst, ideally within a capital markets firm or large financial institution.
- Hands-on experience with: capital markets products, trade life cycle, and risk management concepts; developing functional specifications, test plans, and business cases; SDLC methodologies including Agile, Waterfall, or hybrid models; SQL (basic proficiency in SELECT queries for data analysis); MS Excel, PowerPoint, and Visio for documentation and analysis.
- Familiarity with relational database concepts and data modeling.
- Strong communication skills and the ability to translate complex business processes into technical requirements.
- Ability to work independently and manage priorities in a dynamic, fast-paced environment.

Preferred:
- Experience in cloud transformation or large-scale technology modernization projects.
- Familiarity with tools such as JIRA, Confluence, or Azure DevOps.
- Exposure to data governance, data lineage, or ETL/data warehouse projects in a capital markets context.
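To make the "basic proficiency in SELECT queries" requirement concrete, here is a small, self-contained example using an in-memory SQLite table as a stand-in for a trade repository; the table and its columns are invented for illustration.

    # Illustrative UAT-style data check with plain SELECT queries.
    # The trades table and its values are invented for this sketch.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE trades (trade_id INTEGER, instrument TEXT, "
        "notional REAL, status TEXT)"
    )
    conn.executemany(
        "INSERT INTO trades VALUES (?, ?, ?, ?)",
        [(1, "IRS", 5_000_000.0, "SETTLED"),
         (2, "FX_FWD", 1_200_000.0, "FAILED"),
         (3, "IRS", 2_500_000.0, "SETTLED")],
    )

    # Typical analysis query: settled notional per instrument
    for row in conn.execute(
        "SELECT instrument, COUNT(*) AS trades, SUM(notional) AS total_notional "
        "FROM trades WHERE status = 'SETTLED' GROUP BY instrument"
    ):
        print(row)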

Posted 3 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Technical Project Manager - Data Engineering
Job Date: Jun 15, 2025
Job Requisition Id: 61626
Location: Pune, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Data Engineering Professionals in the following areas:

Experience: 10-12 years

Job Summary: We are seeking a detail-oriented and technically proficient Technical Project Manager (TPM) with a strong background in data engineering, analytics, or data science. The TPM will be responsible for leading cross-functional teams to deliver data-centric projects on time, within scope, and within budget. This role bridges the gap between business needs and technical execution, ensuring alignment across stakeholders.

Key Responsibilities:
- Lead end-to-end project management for data and engineering initiatives (e.g., data platform migrations, analytics implementations, ML model deployments), including planning, execution, and delivery.
- Collaborate with data engineers, analysts, and business stakeholders to define project scope, goals, and deliverables.
- Develop detailed project plans, timelines, and resource allocations.
- Manage project risks, issues, and changes to ensure successful delivery.
- Ensure data quality, governance, and compliance standards are met.
- Facilitate communication across technical and non-technical teams.
- Track project performance using appropriate tools and techniques.
- Conduct post-project evaluations and implement lessons learned.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- 5+ years of experience in project management, with at least 2 years managing data-focused projects.
- Strong understanding of data pipelines, ETL processes, cloud platforms (e.g., AWS, Azure), and data governance.
- Proficiency with project management tools (e.g., Jira, MS Project).
- Excellent communication, leadership, and stakeholder management skills.
- Familiarity with BI tools (e.g., Power BI, Tableau).
- PMP or Agile/Scrum certification is a plus.

Required Technical/Functional Competencies:
- Change Management: Specialized in overcoming resistance to change and helping organizations achieve their Agile goals. Able to guide teams in driving change management projects or requirements.
- Customer Management: Specialized knowledge of customers' business domain and technology suite. Use the latest technology, communicate effectively, demonstrate leadership, present technical offerings, and proactively suggest solutions.
- Delivery Management: Specialized knowledge of deal modeling, commercial and pricing models. Create an integrated pricing model across service lines and guide team members in applying pricing techniques. Grow the account, forecast revenues, and analyze complex internal reports. Manage at least one complex account (>10M) or multiple small accounts independently.
- Domain/Industry Knowledge: Specialized knowledge of customers' business processes and relevant technology platforms or products. Able to forecast business requirements and market trends, manage project issues, and validate the customer strategy roadmap.
- Product/Technology Knowledge: In-depth knowledge of the platform/product and associated technologies. Review product-specific solutions for a specific project, client, or organization, and conduct product demos, walkthroughs, and presentations to prospects if required.
- Profitability Management: Demonstrate competence in applying profitability and cost management techniques. Can develop project budgets, monitor actual costs against the budget, and identify potential cost overruns or deviations. Use established processes and tools to track and control project expenses.
- Project Management: Extensive experience in managing projects; can handle complex projects with minimal supervision. Deep understanding of project management concepts and methodologies, applied effectively to achieve project goals.
- Scheduling and Resource Planning: Prepare independent global delivery models covering skill levels, skill mix, and onsite/offshore work allocation. Create an accurate resource plan for people, space, and infrastructure for the given requirements. Forecast people and skill requirements to align with plans. Optimize the schedule for complex projects.
- Service Support and Maintenance: Plan and execute transitions for large or complex activities. Define standards in transition management based on industry trends and contribute to building tools and accelerators for the KT process. Optimize resource utilization based on customer demand. Select and define SLAs; track service levels and analyze the impact of SLAs on complex processes and deliverables.
- Risk Management: Good understanding of risk management principles and techniques. Identify, assess, and document risks independently, prioritize risks based on their potential impact, assist in developing risk mitigation plans, and monitor risk responses.

Required Behavioral Competencies:
- Accountability: Being a role model for taking initiative and ensuring others take initiative, removing obstacles for others, and taking ownership of results and deadlines for self and others.
- Agility: Works with a diverse set of situations, people, and groups, and adapts and motivates self and team to thrive in a changing environment.
- Collaboration: Reaches out to others in the team to ensure connections are made and team members are working together. Looks for ways to integrate work with other teams, identifying similarities and opportunities and making necessary changes in work to ensure successful integration.
- Customer Focus: Engages in executive customer discovery to predict future needs of customers, drives customer relationships with a long-term focus, and takes actions to enhance customer loyalty.
- Communication: Communicates and presents complex ideas, information, and data to multiple, broad, and demanding stakeholders internal and/or external to the organization. Helps others communicate better with their audience. Demonstrates honest, direct, and transparent communication and facilitates conversations within the team and its close collaborators.
- Drives Results: Proactively seeks challenging and differentiated opportunities and drives and motivates team members to take on more responsibility.
- Resolves Conflict: Balances the business interests of all stakeholders and manages any conflicts by offering mutually beneficial options.

Certifications: PMP (Project Management Professional), PRINCE2 (Projects in Controlled Environments)

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP BW on HANA Data Modeling & Development
Good to have skills: NA
Minimum 5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must to have: SAP ABAP on HANA and AMDP skills.
- Must To Have Skills: Proficiency in SAP BW on HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and dashboard creation.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW on HANA Data Modeling & Development.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP BW on HANA Data Modeling & Development
Good to have skills: NA
Minimum 5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must to have: SAP ABAP on HANA and AMDP skills.
- Must To Have Skills: Proficiency in SAP BW on HANA Data Modeling & Development.
- Strong understanding of data warehousing concepts and best practices.
- Experience with ETL processes and data integration techniques.
- Familiarity with reporting tools and dashboard creation.
- Ability to troubleshoot and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP BW on HANA Data Modeling & Development.
- This position is based at our Pune office.
- A 15 years full time education is required.

Posted 3 days ago

Apply

7.0 years

0 Lacs

Haveli, Maharashtra, India

On-site

Linkedin logo

Sr. Software Engineer - Azure Power BI
Job Date: Jun 15, 2025
Job Requisition Id: 61602
Location: Pune, MH, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Power BI Professionals in the following areas:

Job Description: The candidate should have strong hands-on Power BI experience and a robust background in Azure data modeling and ETL (Extract, Transform, Load) processes, along with essential hands-on experience in advanced SQL and Python. Proficiency in building data lakes and pipelines using Azure, and MS Fabric implementation experience, are required. Additionally, knowledge and experience in the Quality domain, as well as Azure certifications, are considered a plus.

Required Skills:
- 7+ years of experience in software engineering, with a focus on data engineering.
- 5+ years of proven, extensive hands-on experience in Power BI report development.
- 3+ years in data analytics, with a strong focus on Azure data services.
- Strong experience in data modeling and ETL processes.
- Advanced hands-on SQL and Python knowledge and experience working with relational databases for data querying and retrieval.
- Drive best practices in data engineering, data modeling, data integration, and data visualization to ensure the reliability, scalability, and performance of data solutions.
- Able to work independently end to end and guide other team members.
- Exposure to Microsoft Fabric is good to have.
- Good knowledge of SAP and quality processes.
- Excellent business communication skills.
- Good data analytical skills to analyze data and understand business requirements.
- Excellent knowledge of SQL for performing data analysis and performance tuning.
- Ability to test and document end-to-end processes.
- Proficient in the MS Office suite (Word, Excel, PowerPoint, Access, Visio).
- Proven strong relationship-building and communication skills with team members and business users.
- Excellent communication and presentation skills, with the ability to effectively convey technical concepts to non-technical stakeholders.
- Partner with business stakeholders to understand their data requirements, challenges, and opportunities, and identify areas where data analytics can drive value.

Desired Skills:
- Extensive hands-on experience with Power BI.
- 5+ years of proven experience in data analytics with a strong focus on Azure data services and Power BI.
- Exposure to Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
- Solid understanding of data visualization and engineering principles, including data modeling, ETL/ELT processes, and data warehousing concepts.
- Experience with Microsoft Fabric is good to have.
- Strong proficiency in SQL.
- HANA modeling experience is nice to have.
- BusinessObjects and Tableau are nice to have.
- Experience working in a captive unit is a plus.
- Excellent communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
- Strong problem-solving skills and the ability to thrive in a fast-paced, dynamic environment.

Responsibilities:
- Work with Quality and IT teams to design and implement data solutions, including the methods and processes used to translate business needs into functional and technical specifications.
- Design, develop, and maintain robust data models, ETL pipelines, and visualizations.
- Build Power BI reports and dashboards.
- Build a new data lake in Azure, expanding and optimizing our data platform and data pipeline architecture, and optimize data flow and collection for cross-functional teams.
- Design and develop solutions in Azure big data frameworks/tools: Azure Data Lake, Azure Data Factory, Fabric.
- Develop and maintain Python scripts for data processing and automation.
- Troubleshoot and resolve data-related issues and provide support for escalated technical problems.
- Drive process improvement.
- Ensure data quality and integrity across various data sources and systems; maintain the quality and integrity of data in the warehouse, correcting any data problems.
- Participate in code reviews and contribute to best practices for data engineering.
- Ensure data security and compliance with relevant regulations and best practices.
- Develop standards, process flows, and tools that promote and facilitate the mapping of data sources, documenting interfaces and data movement across the enterprise.
- Ensure designs meet the requirements.

Education: IT graduate (BE, BTech, MCA) preferred.

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture

Posted 3 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Vestas is a major player in wind technology and a driving force in the development of the wind power industry. Vestas' core business comprises the development, manufacture, sale, marketing, and maintenance of wind turbines. Come and join us at Vestas!

Digital Solutions & Development > Digital Architecture & Data & AI, Data Domains & AI > Data Domain - Tech Area

Responsibilities:
- Create and maintain scalable data pipelines for analytics use cases, assembling large, complex data sets that meet functional and non-functional business requirements.
- Develop logical and physical data models using optimal data model structures for data warehouse and data mart designs to support analytical needs.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability.
- Collaborate with technology and platform management partners to optimize data sourcing and processing rules to ensure appropriate data quality.
- Hands-on role (100%): building data solutions using best practices and architecture recommendations.

Qualifications:
- Bachelor's or Master's in engineering (degree in Computer Science, IT, Engineering, or similar).
- Work experience as a Data Engineer in a Data & Analytics team, with 3+ years of relevant work experience and overall experience of 6-10 years.
- Data engineering experience: advanced working SQL knowledge and experience in building and maintaining scalable ETL/EL data pipelines to support a continuing increase in data volume and complexity.
- Enterprise experience in business intelligence/analytics teams supporting the design, development, and maintenance of the backend data layer for BI/ML solutions.
- Deep understanding of data structures and data models to design and develop data solutions ensuring data availability, security, and accessibility.

Competencies (Tools/Technologies/Frameworks):
- Expertise in working with various data warehouse solutions and constructing data products using technologies such as Snowflake, Databricks, and the Azure data engineering stack (storage accounts, key vaults, MS SQL, etc.) is mandatory.
- Strong work experience in SQL/stored procedures and relational modeling to build the data layer for BI/analytics is mandatory.
- Extensive hands-on data modeling experience with cloud data warehouses and data structures.
- Hands-on experience with one of the ETL/EL tools such as dbt, Azure Data Factory, or SSIS is an advantage.
- Proficiency in code management and version control tools such as Git and DevOps.

Business/Soft Skills:
- Strong data/software engineering fundamentals; experience in an Agile/Scrum environment preferred.
- Ability to communicate with stakeholders across different geographies and collaborate with analytics and data science teams to match technical solutions with customer business requirements.
- Familiarity with business metrics such as KPIs, PPIs, and other indicators.
- Curious and passionate about building value-creating and innovative data solutions.

What We Offer:
- An opportunity to impact climate change and the future of the next generations through data, analytics, cloud, and machine learning.
- A steep learning curve. We are building a strong team of Data Engineers with both broad and deep knowledge, which means everyone will have somebody to learn from, just as we will invest in continuous learning, knowledge sharing, and upskilling.
- Strong relationships. We will strive to build an environment of mutual trust and a tightly knit team, where we can support and inspire each other to deliver great impact for Vestas.
- The opportunity to shape your role. We have been asked to scale and deliver data and insights products; the rest is up to us.
- A healthy work-life balance and a commitment to fostering a diverse and inclusive workplace environment where everyone can thrive and bring their unique perspectives and skills to the team.

Overall, we offer you the opportunity to make a difference and work in a multicultural international company, where you can improve your skills and grow professionally to reach new heights.

Additional Information: Your primary workplace will be Chennai. Please note: we do amend or withdraw our jobs and reserve the right to do so at any time, including prior to the advertised closing date. Please apply on or before 16th July 2025.

BEWARE – RECRUITMENT FRAUD: It has come to our attention that there are a number of fraudulent emails from people pretending to work for Vestas. Read more via this link: https://www.vestas.com/en/careers/our-recruitment-process

DEIB Statement: At Vestas, we recognise the value of diversity, equity, and inclusion in driving innovation and success. We strongly encourage individuals from all backgrounds to apply, particularly those who may hesitate due to their identity or feel they do not meet every criterion. As our CEO states, "Expertise and talent come in many forms, and a diverse workforce enhances our ability to think differently and solve the complex challenges of our industry". Your unique perspective is what will help us power the solution for a sustainable, green energy future.

About Vestas: Vestas is the energy industry's global partner on sustainable energy solutions. We specialise in designing, manufacturing, installing, and servicing wind turbines, both onshore and offshore. Across the globe, we have installed more wind power than anyone else. We consider ourselves pioneers within the industry, as we continuously aim to design new solutions and technologies to create a more sustainable future for all of us. With more than 185 GW of wind power installed worldwide and 40+ years of experience in wind energy, we have an unmatched track record demonstrating our expertise within the field. With 30,000 employees globally, we are a diverse team united by a common goal: to power the solution – today, tomorrow, and far into the future. Vestas promotes a diverse workforce which embraces all social identities and is free of any discrimination. We commit to creating and sustaining an environment that acknowledges and harvests different experiences, skills, and perspectives. We also aim to give everyone equal access to opportunity. To learn more about our company and life at Vestas, we invite you to visit our website at www.vestas.com and follow us on our social media channels. We also encourage you to join our Talent Universe to receive notifications on new and relevant postings.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Greater Hyderabad Area

On-site

Linkedin logo

Job Overview: The Sr. Software Engineer will be part of a team of some of the best and brightest in the industry, focused on full-cycle development of scalable web and responsive applications that touch our growing customer base every day. As part of the Labs team, you will work collaboratively with agile team members to design new system functionality and to research and remedy complex issues as they arise, embodying a passion for continuous improvement and test-driven development.

About Us: When you join iCIMS, you join the team helping global companies transform business and the world through the power of talent. Our customers do amazing things: design rocket ships, create vaccines, deliver consumer goods globally, overnight, with a smile. As the Talent Cloud company, we empower these organizations to attract, engage, hire, and advance the right talent. We're passionate about helping companies build a diverse, winning workforce and about building our home team. We're dedicated to fostering an inclusive, purpose-driven, and innovative work environment where everyone belongs.

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Collaborate with software engineers, data scientists, and product managers to understand data requirements and provide tailored solutions.
- Optimize and enhance the performance of our data infrastructure to support analytics and reporting.
- Implement and maintain data governance and security best practices.
- Troubleshoot and resolve data-related issues and ensure data quality and integrity.
- Mentor and guide junior data engineers, fostering a culture of continuous learning and improvement.

Qualifications:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Proficiency in SQL and experience with relational databases (e.g., PostgreSQL, MySQL).
- Strong programming skills in Python.
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and containerization (e.g., Docker).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

EEO Statement: iCIMS is a place where everyone belongs. We celebrate diversity and are committed to creating an inclusive environment for all employees. Our approach helps us to build a winning team that represents a variety of backgrounds, perspectives, and abilities. So, regardless of how your diversity expresses itself, you can find a home here at iCIMS. We are proud to be an equal opportunity and affirmative action employer. We prohibit discrimination and harassment of any kind based on race, color, religion, national origin, sex (including pregnancy), sexual orientation, gender identity, gender expression, age, veteran status, genetic information, disability, or other applicable legally protected characteristics. If you would like to request an accommodation due to a disability, please contact us at careers@icims.com.

Compensation and Benefits: Competitive health and wellness benefits include medical insurance (employee and dependent family members), personal accident and group term life insurance, bonding and parental leave, lifestyle spending account reimbursements, wellness services offerings, sick and casual/emergency days, paid holidays, tuition reimbursement, retirals (PF - employer contribution) and gratuity. Benefits and eligibility may vary by location, role, and tenure. Learn more here: https://careers.icims.com/benefits
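As a rough illustration of the SQL-plus-Python data-quality work listed above, the sketch below runs two checks against a PostgreSQL database; the DSN, table, and column names are invented for the example, not taken from the posting.

    # Hypothetical data-quality checks against PostgreSQL via psycopg2.
    # The DSN, table, and column names are placeholders.
    import psycopg2

    CHECKS = {
        "null_emails": "SELECT COUNT(*) FROM candidates WHERE email IS NULL",
        "duplicate_ids": (
            "SELECT COUNT(*) FROM ("
            "  SELECT candidate_id FROM candidates"
            "  GROUP BY candidate_id HAVING COUNT(*) > 1"
            ") AS dupes"
        ),
    }

    def run_checks(dsn):
        """Run each check and return {check_name: offending_row_count}."""
        results = {}
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                for name, sql in CHECKS.items():
                    cur.execute(sql)
                    results[name] = cur.fetchone()[0]
        return results

    if __name__ == "__main__":
        print(run_checks("dbname=talent user=etl_user host=localhost"))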

Posted 3 days ago

Apply

0.0 - 31.0 years

0 - 1 Lacs

Work From Home

On-site

Apna logo

Looking for a Power BI Developer - Fresher/Experienced
- Design and develop interactive dashboards and reports using Power BI.
- Perform data modeling, write DAX expressions, and integrate data from various sources.
- Collaborate with business teams to gather requirements and deliver actionable insights.
- Requires proficiency in Power BI, SQL, and a basic understanding of ETL processes.

Posted 3 days ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Company Description: Oncorre Inc. is a boutique digital transformation consultancy headquartered in Bridgewater, New Jersey. Established in 2007, Oncorre provides cutting-edge engineering solutions for Fortune companies and government agencies across the USA. Our mission is to help enterprises accelerate adoption of new technologies, untangle complex issues during digital evolution, and orchestrate ongoing innovation. We offer flexible service models, including both onsite and offsite support, to meet our clients' diverse needs.

Job Type: Full-time or Contract
Start Date: July 01, 2025

Role Description: This is a full-time hybrid role for a Data Engineer - SQL and Snowflake at Oncorre Inc. The role will involve tasks such as data engineering, data modeling, ETL processes, data warehousing, and data analytics. The candidate should provide technical expertise in needs identification, data modeling, data movement, and translating business needs into technical solutions, with adherence to established data guidelines and approaches from a business unit or project perspective. Good knowledge of conceptual, logical, and physical data models, and of the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL), is expected. The candidate will oversee and govern the expansion of the existing data architecture and the optimization of data query performance via best practices, and must be able to work independently and collaboratively.

Requirements:
- 6+ years of experience as a Data Engineer.
- Strong technical expertise in SQL and Snowflake is a must.
- Strong knowledge of joins and common table expressions (CTEs).
- Strong experience with Python.
- Strong expertise in ETL processes and various data model concepts.
- Knowledge of star schema and snowflake schema.
- Good to know: AWS services such as S3, Athena, Glue, and EMR/Spark, with a major emphasis on S3 and Glue.
- Experience with Big Data tools and technologies.

Key Skills:
- Good understanding of data structures and data analysis using SQL.
- Knowledge of implementing ETL/ELT for data solutions end-to-end.
- Understanding requirements and data solutions (ingest, storage, integration, processing).
- Knowledge of analyzing data using SQL.
- Conducting end-to-end verification and validation for the entire application.

Responsibilities:
- Understand and translate business needs into data models supporting long-term solutions.
- Perform reverse engineering of physical data models from databases and SQL scripts.
- Analyze data-related system integration challenges and propose appropriate solutions.
- Assist with and support setting the data architecture direction (including data movement approach, architecture/technology strategy, and any other data-related considerations to ensure business value).
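As a hedged sketch of the CTE-plus-join SQL this posting emphasizes, the example below runs a query through the official Snowflake Python connector. The account, credentials, warehouse, and star-schema table names are placeholders, not details from the posting.

    # Sketch: a CTE-based aggregation joined to a dimension table, run via
    # snowflake-connector-python. All connection details and tables are
    # hypothetical placeholders.
    import snowflake.connector

    CTE_QUERY = """
    WITH daily_sales AS (
        SELECT order_date, SUM(amount) AS total_amount
        FROM fact_orders
        GROUP BY order_date
    )
    SELECT d.order_date, d.total_amount
    FROM daily_sales d
    JOIN dim_calendar c ON c.calendar_date = d.order_date
    WHERE c.is_business_day
    ORDER BY d.order_date
    """

    conn = snowflake.connector.connect(
        account="<ACCOUNT>", user="<USER>", password="<PASSWORD>",
        warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
    )
    try:
        # execute() returns the cursor, which is iterable over result rows
        for row in conn.cursor().execute(CTE_QUERY):
            print(row)
    finally:
        conn.close()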

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru East, Karnataka, India

On-site

Linkedin logo

Job Summary: The customer is seeking a highly skilled Data Engineer with FME expertise who is a local resident of Gurugram, Bengaluru, or Nagpur.

Job Responsibilities:

1. Data Integration & Transformation
- FME (Safe Software): build ETL pipelines to read from Idox/CCF and transform data to a given schema.
- FME custom transformers: create reusable rule logic for validations and fixes.
- Python (in FME or standalone): write custom data fix logic, date parsers, and validation scripts.
- Data profiling tools: understand completeness, accuracy, and consistency in batches.

2. Spatial Data Handling
- PostgreSQL/PostGIS: store and query spatial data; support dashboard analytics.
- GeoPackage, GML, GeoJSON, Shapefile: understand source file formats for ingest/export.
- Geometry validators and fixers: fix overlaps, slivers, and invalid polygons using FME or SQL.
- Coordinate systems (e.g., EPSG:27700): ensure correct projections and alignment with target systems.

3. Automation & Data Workflow Orchestration
- FME Server / FME Cloud: automate batch runs and monitor ETL pipelines.
- CI/CD, cron jobs, Python scheduling: trigger ingestion and dashboard refreshes on file upload.
- Audit trail and logging: log data issues, rule hits, and processing history.

4. Dashboard Integration Support
- SQL for views and aggregations: support dashboards showing issue counts, trends, and maps.
- Power BI / Grafana / Superset (optional): assist in exposing dashboard metrics.
- Metadata management: tag each batch with status, record counts, and processing stage.

5. Collaborative & Communication Skills
- Interpreting validation reports: communicate dashboard findings to Ops and Analysts.
- Business rule translation: convert requirements into FME transformers or SQL rules.
- Working with LA and HMLR specs: map internal formats to official schemas accurately.

Essential Skills:
- Build and maintain FME workflows to transform source data to target data specs.
- Validate textual and spatial fields using logic embedded in FME or SQL.
- Support issue triaging and reporting via dashboards.
- Collaborate with the data provider, Analysts, and Ops for continuous improvement.
- ETL/Integration: FME, Talend (optional), Python.
- Spatial DB: PostGIS, Oracle Spatial.
- GIS tools: QGIS, ArcGIS.
- Scripting: Python, SQL.
- Validation: FME Testers, AttributeValidator, custom SQL views.
- Format support: CSV, JSON, GPKG, XML, Shapefiles.
- Coordination: Jira, Confluence, Git (for rule versioning).

Background Check Required: No criminal record.

Other Requirements:
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.).
- Work location: onsite in Gurugram, Bengaluru, or Nagpur.
- Only local candidates should apply.
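A standalone sketch of the "date parsers / validation scripts" work named above; inside FME this logic would typically sit in a PythonCaller transformer. The field names and accepted date formats are assumptions for illustration only.

    # Date normalization and rule-based validation of the kind this role
    # describes. Field names and accepted formats are assumptions.
    from datetime import datetime
    from typing import Optional

    CANDIDATE_FORMATS = ["%d/%m/%Y", "%Y-%m-%d", "%d-%b-%Y"]

    def normalise_date(value):
        # type: (str) -> Optional[str]
        """Return an ISO-8601 date string, or None if no known format matches."""
        for fmt in CANDIDATE_FORMATS:
            try:
                return datetime.strptime(value.strip(), fmt).date().isoformat()
            except ValueError:
                continue
        return None

    def validate_record(record):
        """Collect rule violations rather than failing fast, for audit logging."""
        issues = []
        if not record.get("title_number"):
            issues.append("missing title_number")
        if normalise_date(record.get("registration_date", "")) is None:
            issues.append("unparseable registration_date")
        return issues

    print(normalise_date("14/03/2024"))  # -> 2024-03-14
    print(validate_record({"title_number": "", "registration_date": "2024-13-99"}))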

Posted 3 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Linkedin logo

Job Summary: We are looking for an accomplished and dynamic Data Engineering Lead to join our team and drive the design, development, and delivery of cutting-edge data solutions. This role requires a balance of strong technical expertise, strategic leadership, and a consulting mindset. As the Lead Data Engineer, you will oversee the design and development of robust data pipelines and systems, manage and mentor a team of 5 to 7 engineers, and play a critical role in architecting innovative solutions tailored to client needs. You will lead by example, fostering a culture of accountability, ownership, and continuous improvement while delivering impactful, scalable data solutions in a fast-paced consulting environment.

Job Responsibilities:

Client Collaboration:
- Act as the primary point of contact for US-based clients, ensuring alignment on project goals, timelines, and deliverables.
- Engage with stakeholders to understand requirements and ensure alignment throughout the project lifecycle.
- Present technical concepts and designs to both technical and non-technical audiences.
- Communicate effectively with stakeholders to ensure alignment on project goals, timelines, and deliverables.
- Set realistic expectations with clients and proactively address concerns or risks.

Data Solution Design and Development:
- Architect, design, and implement end-to-end data pipelines and systems that handle large-scale, complex datasets.
- Ensure optimal system architecture for performance, scalability, and reliability.
- Evaluate and integrate new technologies to enhance existing solutions.
- Implement best practices in ETL/ELT processes, data integration, and data warehousing.

Project Leadership and Delivery:
- Lead technical project execution, ensuring timelines and deliverables are met with high quality.
- Collaborate with cross-functional teams to align business goals with technical solutions.
- Act as the primary point of contact for clients, translating business requirements into actionable technical strategies.

Team Leadership and Development:
- Manage, mentor, and grow a team of 5 to 7 data engineers; ensure timely follow-ups on action items and maintain seamless communication across time zones.
- Conduct code reviews and validations, and provide feedback to ensure adherence to technical standards.
- Provide technical guidance and foster an environment of continuous learning, innovation, and collaboration.
- Support collaboration and alignment between the client and delivery teams.

Optimization and Performance Tuning:
- Be hands-on in developing, testing, and documenting data pipelines and solutions as needed.
- Analyze and optimize existing data workflows for performance and cost-efficiency.
- Troubleshoot and resolve complex technical issues within data systems.

Adaptability and Innovation:
- Embrace a consulting mindset with the ability to quickly learn and adopt new tools, technologies, and frameworks.
- Identify opportunities for innovation and implement cutting-edge technologies in data engineering.
- Exhibit a "figure it out" attitude, taking ownership and accountability for challenges and solutions.

Learning and Adaptability:
- Stay updated with emerging data technologies, frameworks, and tools.
- Actively explore and integrate new technologies to improve existing workflows and solutions.

Internal Initiatives and Eminence Building:
- Drive internal initiatives to improve processes, frameworks, and methodologies.
- Contribute to the organization's eminence by developing thought leadership, sharing best practices, and participating in knowledge-sharing activities.

Qualifications:

Education:
- Bachelor's or master's degree in Computer Science, Data Engineering, or a related field.
- Certifications in cloud platforms, such as Snowflake SnowPro Data Engineer, are a plus.

Experience:
- 8+ years of experience in data engineering with hands-on expertise in data pipeline development, architecture, and system optimization.
- Demonstrated success in managing global teams, especially across US and India time zones.
- Proven track record in leading data engineering teams and managing end-to-end project delivery.
- Strong background in data warehousing and familiarity with tools such as Matillion, dbt, and Striim.

Technical Skills:
- Lead the design, development, and deployment of scalable data architectures, pipelines, and processes tailored to client needs.
- Expertise in programming languages such as Python, Scala, or Java.
- Proficiency in designing and delivering data pipelines in cloud data warehouses (e.g., Snowflake, Redshift), using various ETL/ELT tools such as Matillion, dbt, and Striim.
- Solid understanding of database systems (relational and NoSQL) and data modeling techniques.
- 2+ years of hands-on experience designing and developing data integration solutions using Matillion and/or dbt.
- Strong knowledge of data engineering and integration frameworks.
- Expertise in architecting data solutions.
- Successfully implemented at least two end-to-end projects with multiple transformation layers.
- Good grasp of coding standards, with the ability to define standards and testing strategies for projects.
- Proficiency in working with cloud platforms (AWS, Azure, GCP) and associated data services.
- Enthusiastic about working in Agile methodology.
- Comprehensive understanding of the DevOps process, including GitHub integration and CI/CD pipelines.

Soft Skills:
- Exceptional problem-solving and analytical skills.
- Strong communication and interpersonal skills to manage client relationships and team dynamics.
- Ability to thrive in a consulting environment, quickly adapting to new challenges and domains.
- Ability to handle ambiguity and proactively take ownership of challenges.
- Demonstrated accountability, ownership, and a proactive approach to solving problems.

Background Check Required: No criminal record.

Other Information:
- Bachelor's or master's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
- There are 2-3 rounds in the interview process.
- This is a 5-day work-from-office role (no hybrid/remote options available).

Posted 3 days ago

Apply

Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  1. Junior ETL Developer
  2. ETL Developer
  3. Senior ETL Developer
  4. ETL Tech Lead
  5. ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium) (see the code sketch after this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
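Several of the questions above (incremental loads, full versus incremental loads, change data capture) revolve around the same watermark pattern: track the last successfully loaded timestamp and pull only newer rows. The sketch below is a minimal illustration with in-memory stand-ins for the source and target systems; in a real pipeline the watermark would be persisted in a control table and the load step would be an upsert.

    # Minimal watermark-based incremental load, with in-memory stand-ins
    # for the source and target systems.
    from datetime import datetime

    source_rows = [
        {"id": 1, "updated_at": datetime(2024, 1, 1)},
        {"id": 2, "updated_at": datetime(2024, 1, 5)},
        {"id": 3, "updated_at": datetime(2024, 1, 9)},
    ]
    target_rows = []
    watermark = datetime(2024, 1, 3)  # last successfully loaded timestamp

    def incremental_load():
        """Extract only rows newer than the watermark, then advance it."""
        global watermark
        new_rows = [r for r in source_rows if r["updated_at"] > watermark]
        target_rows.extend(new_rows)  # a real system would upsert here
        if new_rows:
            watermark = max(r["updated_at"] for r in new_rows)
        return len(new_rows)

    print(incremental_load())  # 2 -> rows with ids 2 and 3 are loaded
    print(watermark)           # advances to 2024-01-09 00:00:00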

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
