Home
Jobs

10,660 ETL Jobs - Page 27

JobPe aggregates listings for easy access; you apply directly on the original job portal.

7.0 years

5 - 23 Lacs

Hyderābād

On-site

Source: Glassdoor

Job Title: Senior Data Engineer
Experience: 7+ years
Location: Hyderabad, Telangana
Time Zone: IST
Primary Tech Stack: SQL, query and database performance tuning, ETL, integrations and data transformations, Python scripting, AWS core services (S3, Lambda, IAM)

General Information:
We are looking for exceptional Senior Data Engineers (SDEs) to play a significant role in building our large-scale, high-volume, high-performance data integration and delivery services. These data solutions are used primarily in periodic reporting and drive business decision-making, while dealing efficiently with the massive scale of data available through our Data Warehouse and our software systems. You will be responsible for designing and implementing solutions using third-party and in-house data processing tools, building dimensional data models, reports, and dashboards, integrating data across disparate and distributed systems, and administering the platform software. You are expected to analyze challenging business problems and build efficient, flexible, extensible, and scalable data models, ETL designs, and data integration services. You will also have the opportunity to build, maintain, and enhance small to mid-size custom applications using Python or Java, and you will be expected to support and manage the growth of these data solutions.

Job Description:
As a Data Engineer, you will work in one of the world's largest cloud-based data lakes. You should be skilled in architecting enterprise data warehouse solutions across multiple platforms (EMR, RDBMS, columnar, cloud), and you should have extensive experience in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, enabling you to work with business owners to define key business questions and to build data sets that answer them. Above all, you should be passionate about working with huge data volumes and about bringing datasets together to answer business questions and drive growth.

Skills Needed:
- SQL: Expert
- Query and database performance tuning: Expert
- ETL, integrations, and data transformations: Proficient
- Python scripting: Proficient
- AWS core services (S3, Lambda, IAM): Intermediate

Job Type: Full-time
Pay: ₹500,298.14 - ₹2,350,039.92 per year
Work Location: In person
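Since the posting pairs Python scripting with S3 and Lambda, a minimal sketch of that pattern may help orient candidates. Everything below is illustrative rather than taken from the posting: the bucket names, the CSV layout, and the region-normalization step are assumptions, and the handler assumes an S3 event notification trigger.

```python
# Minimal sketch of an S3-triggered cleaning step (Python + S3 + Lambda).
# Assumptions: the function is wired to an S3 event notification, the files
# are CSV, and "curated-bucket" plus the region column are invented names.
import csv
import io

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Each record describes an object that landed in the source bucket.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))
        if not rows:
            continue

        # Example transformation: normalize a region column before loading.
        for row in rows:
            row["region"] = row.get("region", "").strip().upper()

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

        # Land the cleaned file under a curated prefix for downstream loads.
        s3.put_object(Bucket="curated-bucket", Key=f"clean/{key}", Body=out.getvalue())
```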

Posted 2 days ago

Apply

10.0 years

7 - 20 Lacs

India

On-site

Source: Glassdoor

About MostEdge
At MostEdge, we are on a mission to accelerate commerce and build sustainable, trusted experiences. Our slogan, "Protect Every Penny. Power Every Possibility.", reflects our commitment to operational excellence, data integrity, and real-time intelligence that help retailers run smarter, faster, and stronger. Our systems are mission-critical, designed for 99.99999% uptime, and power millions of transactions and inventory updates daily. We work at the intersection of AI, microservices, and retail commerce, and we win as a team.

Role Overview
We are looking for a Senior Database Administrator (DBA) to own the design, implementation, scaling, and performance of our data infrastructure. You will be responsible for mission-critical OLTP systems spanning MariaDB, MySQL, PostgreSQL, and MongoDB, deployed across AWS, GCP, and containerized Kubernetes clusters. This role plays a key part in ensuring data consistency, security, and speed across billions of rows and real-time operations.

What You Will Own
- Manage and optimize multi-tenant, high-availability databases for real-time inventory, pricing, sales, and vendor data.
- Design and maintain scalable, partitioned database architectures across SQL and NoSQL systems.
- Monitor and tune query performance, and ensure fast recovery, replication, and backup practices.
- Partner with developers, analysts, and DevOps teams on schema design, ETL pipelines, and microservices integration.
- Maintain security best practices, audit logging, encryption standards, and data retention compliance.

What Success Looks Like
- 99.99999% uptime maintained across all environments.
- Query response times under 100 ms for large-scale datasets.
- Zero unplanned data loss or corruption incidents.
- Developer teams experience zero bottlenecks from DB-related delays.

Skills & Experience
Must-Have
- 10+ years of experience managing OLTP systems at scale.
- Strong hands-on experience with MySQL, MariaDB, PostgreSQL, and MongoDB.
- Proven expertise in replication, clustering, indexing, and sharding.
- Experience with Kubernetes-based deployments, Kafka queues, and Dockerized apps.
- Familiarity with AWS S3 storage, GCP services, and hybrid cloud data replication.
- Experience in startup environments with fast-moving agile teams.
- Track record of creating clear documentation and managing tasks via JIRA.

Nice-to-Have
- Experience with AI/ML data pipelines, vector databases, or embedding stores.
- Exposure to infrastructure as code (e.g., Terraform, Helm).
- Familiarity with LangChain, FastAPI, or modern LLM-driven architectures.

How You Reflect Our Values
- Lead with Purpose: You enable smarter, faster systems that empower our retail customers.
- Build Trust: You create safe, accurate, and recoverable environments.
- Own the Outcome: You take responsibility for uptime, audits, and incident resolution.
- Win Together: You collaborate seamlessly across product, ops, and engineering.
- Keep It Simple: You design intuitive schemas, efficient queries, and clear alerts.

Why Join MostEdge?
- Work on high-impact systems powering real-time retail intelligence.
- Collaborate with a passionate, values-driven team across AI, engineering, and operations.
- Build at scale, with autonomy, ownership, and cutting-edge tech.

Job Types: Full-time, Permanent
Pay: ₹727,996.91 - ₹2,032,140.73 per year
Benefits: Health insurance, life insurance, paid sick time, paid time off, Provident Fund
Schedule: Evening shift, morning shift, US shift
Supplemental Pay: Performance bonus, yearly bonus
Work Location: In person
Expected Start Date: 31/07/2025
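For readers unfamiliar with the "partitioned database architectures" the posting mentions, here is a minimal sketch of one common approach: range partitioning by date in MySQL/MariaDB, checked with EXPLAIN so that date-bounded queries prune to a single partition. The table, columns, and connection details are invented for illustration; they are not MostEdge specifics.

```python
# Illustrative only: a range-partitioned MySQL/MariaDB table for high-volume
# sales rows, plus an EXPLAIN to confirm partition pruning. The table, column,
# and connection details are invented for this sketch.
import mysql.connector  # MySQL Connector/Python

DDL = """
CREATE TABLE sales (
    sale_id BIGINT NOT NULL,
    store_id INT NOT NULL,
    sold_at DATE NOT NULL,
    amount DECIMAL(10, 2) NOT NULL,
    PRIMARY KEY (sale_id, sold_at)  -- unique keys must include the partition column
)
PARTITION BY RANGE (TO_DAYS(sold_at)) (
    PARTITION p2024 VALUES LESS THAN (TO_DAYS('2025-01-01')),
    PARTITION p2025 VALUES LESS THAN (TO_DAYS('2026-01-01')),
    PARTITION pmax  VALUES LESS THAN MAXVALUE
)
"""

conn = mysql.connector.connect(host="db-host", user="dba", password="...", database="retail")
cur = conn.cursor()
cur.execute(DDL)

# Partition pruning keeps a date-bounded scan on one partition, which is how
# such queries stay fast as the table grows toward billions of rows.
cur.execute(
    "EXPLAIN SELECT SUM(amount) FROM sales "
    "WHERE sold_at >= '2025-06-01' AND sold_at < '2025-07-01'"
)
for row in cur.fetchall():
    print(row)
conn.close()
```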

Posted 2 days ago

Apply

5.0 years

10 - 27 Lacs

India

On-site

Source: Glassdoor

About MostEdge
At MostEdge, our purpose is clear: accelerate commerce and build sustainable, trusted experiences. With every byte of data, we strive to Protect Every Penny. Power Every Possibility. We empower retailers to make real-time, profitable decisions using cutting-edge AI, smart infrastructure, and operational excellence. Our platforms handle:
- hundreds of thousands of sales transactions per hour
- hundreds of vendor purchase invoices per hour
- a few hundred product updates per day
with systems built for 99.99999% uptime. We are building an AI-native commerce engine, and language models are at the heart of this transformation.

Role Overview
We are looking for an AI/ML Expert with deep experience in training and deploying Large Language Models (LLMs) to power MostEdge's next-generation operations, cost intelligence, and customer analytics platform. You will be responsible for fine-tuning domain-specific models on internal structured and unstructured data (product catalogs, invoices, chats, documents), embedding real-time knowledge through RAG pipelines, and enabling AI-powered interfaces that drive search, reporting, insight generation, and operational recommendations.

What You Will Own
- Fine-tune and deploy LLMs for product, vendor, and shopper-facing use cases.
- Design hybrid retrieval-augmented generation (RAG) pipelines with LangChain, FastAPI, and vector DBs (e.g., FAISS, Weaviate, Qdrant).
- Train models on internal datasets (sales, cost, product specs, invoices, support logs) using supervised fine-tuning and LoRA/QLoRA techniques.
- Orchestrate embedding pipelines, prompt tuning, and model evaluation across customer and field operations use cases.
- Deploy LLMs efficiently on RunPod, AWS, or GCP, optimizing for multi-GPU, low-latency inference.
- Collaborate with engineering and product teams to embed model outputs in dashboards, chat UIs, and retail systems.

What Success Looks Like
- 90%+ accuracy on retrieval and reasoning tasks for product/vendor cost and invoice queries.
- Inference times under 3 seconds across operational prompts, running on GPU-optimized containers.
- Full integration of LLMs with backend APIs, sales dashboards, and product portals.
- 75% reduction in manual effort across selected operational workflows.

Skills & Experience
Must-Have
- 5+ years in AI/ML, with 2+ years working on LLMs or transformer architectures.
- Proven experience training or fine-tuning Mistral, LLaMA, Falcon, or similar open-source LLMs.
- Strong command of LoRA, QLoRA, PEFT, RAG, embeddings, and quantized inference.
- Familiarity with LangChain, HuggingFace Transformers, FAISS/Qdrant, and FastAPI for LLM orchestration.
- Experience deploying models on RunPod, AWS, or GCP using Docker and Kubernetes.
- Proficiency in Python, PyTorch, and data preprocessing (structured and unstructured).
- Experience with ETL pipelines, multi-modal data, and real-time data integration.

Nice-to-Have
- Experience with retail, inventory, or customer analytics systems.
- Knowledge of semantic search, OCR post-processing, or auto-tagging pipelines.
- Exposure to multi-tenant environments and secure model isolation for enterprise use.

How You Reflect Our Values
- Lead with Purpose: You empower smarter decisions with AI-first operations.
- Build Trust: You make model behavior explainable, dependable, and fair.
- Own the Outcome: You train and optimize end-to-end pipelines from data to insights.
- Win Together: You partner across engineering, ops, and customer success teams.
- Keep It Simple: You design intuitive models, prompts, and outputs that drive action, not confusion.

Why Join MostEdge?
- Shape how AI transforms commerce and operations at scale.
- Be part of a mission-critical, high-velocity, AI-first company.
- Build LLMs with purpose, connecting frontline data to real-time results.

Job Types: Full-time, Permanent
Pay: ₹1,068,726.69 - ₹2,729,919.70 per year
Benefits: Health insurance, life insurance, paid sick time, paid time off, Provident Fund
Schedule: Evening shift, morning shift, US shift
Supplemental Pay: Performance bonus, yearly bonus
Work Location: In person
Expected Start Date: 15/07/2025
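The RAG pipelines described above hinge on embedding documents into a vector index and retrieving the closest matches as prompt context. A minimal sketch of that retrieval half, assuming sentence-transformers and FAISS (one of the vector stores the posting names), follows; the model name and the sample documents are placeholders, not MostEdge data.

```python
# Minimal sketch of the retrieval half of a RAG pipeline, assuming
# sentence-transformers and FAISS. The model name and sample "documents"
# are placeholders, not MostEdge data.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

documents = [
    "Invoice 1042: 24 cases of sparkling water from Acme Beverages at $9.50/case.",
    "Product 88-C: store-brand paper towels, 6-roll pack, current cost $4.10.",
    "Vendor Acme Beverages: net-30 terms, weekly delivery on Tuesdays.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(documents, normalize_embeddings=True)

# Inner product over normalized vectors is cosine similarity.
index = faiss.IndexFlatIP(embeddings.shape[1])
index.add(np.asarray(embeddings, dtype="float32"))

query = model.encode(
    ["What do we pay per case for sparkling water?"], normalize_embeddings=True
)
scores, ids = index.search(np.asarray(query, dtype="float32"), 2)

# The top-scoring chunks would be stuffed into the LLM prompt as context.
for score, doc_id in zip(scores[0], ids[0]):
    print(f"{score:.3f}  {documents[doc_id]}")
```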

Posted 2 days ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

Basic Qualifications
- 3+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- 4+ years of SQL experience
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience as a data engineer or in a related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large datasets

Responsibilities
- Design, implement, and support data warehouse / data lake infrastructure using the AWS big data stack: Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark/Scala, Athena, etc.
- Extract huge volumes of structured and unstructured data from various sources (relational, non-relational, and NoSQL databases) and message streams, and construct complex analyses.
- Develop and manage ETLs to source data from various systems and create a unified data model for analytics and reporting.
- Perform detailed source-system analysis, source-to-target data analysis, and transformation analysis.
- Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance.

Preferred Qualifications
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.
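As a rough illustration of the extract-transform-load cycle these bullets describe, the plain PySpark sketch below (runnable on EMR or locally) reads raw JSON from S3, keeps the latest event per order, and writes partitioned Parquet for Athena-style querying. The paths, schema, and dedup rule are assumptions, not details from the posting.

```python
# Rough sketch of the extract-transform-load cycle described above, in plain
# PySpark (runnable on EMR or locally). Paths, schema, and the "latest event
# per order" rule are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw order events landed in S3 as JSON.
orders = spark.read.json("s3://example-raw-bucket/orders/")

# Transform: keep only the most recent event per order_id.
w = Window.partitionBy("order_id").orderBy(F.col("event_ts").desc())
latest = (
    orders.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
    .withColumn("event_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for Athena/Redshift Spectrum to query.
latest.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```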

Posted 2 days ago

Apply

0 years

4 - 6 Lacs

Hyderābād

On-site

Source: Glassdoor

As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world's leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data- and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards an optimal customer and employee experience.

About the Role
In this opportunity as an Application Support Analyst, you will:
- Support Informatica development, extractions, and loading; fix data discrepancies and take care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting or exceeding SLAs.
- Apply a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
- Support applications built on modern application architectures and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You
You're a fit for the role of Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience in Informatica development and support, with responsibility for implementing ETL methodology in data extraction, transformation, and loading.
- Knowledge of ETL design for new or changing mappings and workflows, including preparing technical specifications with the team.
- Experience creating ETL mappings, mapplets, workflows, and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designing and building integrations that support standard data warehousing objects (type-2 dimensions, aggregations, star schemas, etc.).
- Performing source system analysis as required.
- Working with DBAs and data architects to plan and implement appropriate data partitioning strategies in the enterprise data warehouse.
- Implementing versioning of the ETL repository and supporting code as necessary.
- Developing stored procedures, database triggers, and SQL queries where needed, implementing best practices, and tuning SQL code for optimization.
- Loading data from Salesforce to relational databases using Informatica PowerExchange.
- Working with XML, the XML parser, and Java and HTTP transformations within Informatica.
- Experience integrating various data sources such as Oracle, SQL Server, DB2, and flat files in formats like fixed width, CSV, Salesforce, and Excel.
- In-depth knowledge of and experience with best practices for designing and developing data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Support and development experience in a relational database environment, including designing tables, procedures/functions, packages, triggers, and views, and using SQL proficiently in database programming.

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
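The type-2 dimension work listed above follows a standard expire-and-insert pattern. As a hedged sketch only (in practice this logic would live in PowerCenter mappings or warehouse stored procedures rather than Python), the T-SQL below, run through pyodbc, shows the idea; the dim_customer and stg_customer tables, their columns, and the DSN are invented for illustration.

```python
# Hedged sketch of the expire-and-insert pattern behind a type-2 dimension.
# In this role the logic would live in PowerCenter mappings or stored
# procedures; dim_customer, stg_customer, and the DSN are invented here.
import pyodbc

conn = pyodbc.connect("DSN=edw")  # assumed ODBC data source
cur = conn.cursor()

# Step 1: close out current rows whose source attributes have changed.
cur.execute("""
    UPDATE d
    SET d.row_is_current = 0,
        d.row_end_date = CAST(GETDATE() AS DATE)
    FROM dim_customer AS d
    JOIN stg_customer AS s ON s.customer_id = d.customer_id
    WHERE d.row_is_current = 1
      AND (s.city <> d.city OR s.segment <> d.segment)
""")

# Step 2: insert a fresh current row for each changed or brand-new customer
# (after step 1, changed customers no longer have a current row either).
cur.execute("""
    INSERT INTO dim_customer
        (customer_id, city, segment, row_start_date, row_end_date, row_is_current)
    SELECT s.customer_id, s.city, s.segment, CAST(GETDATE() AS DATE), NULL, 1
    FROM stg_customer AS s
    LEFT JOIN dim_customer AS d
        ON d.customer_id = s.customer_id AND d.row_is_current = 1
    WHERE d.customer_id IS NULL
""")

conn.commit()
conn.close()
```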

Posted 2 days ago

Apply

5.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

SUMMARY
The Master Data Management (MDM) Administrator will play a critical role in engaging with stakeholders and technical team members to execute master data creation, maintenance, and governance for our MDM workstream. This position will play a crucial role in managing our master data, ensuring data consistency, and facilitating data-driven decision making. You will collaborate with various departments to ensure data accuracy, integrity, and compliance with established data standards, streamlining data-related processes, enhancing data quality, and promoting a data-driven culture within the company. This role reports to the BEST Data Services Senior Manager in our Business Enterprise Systems Technology department. The successful candidate will take a hands-on approach, assisting developers and architects with our Master Data Management (MDM) platform and FaCT Data Foundations teams and processes, and ensuring the highest-quality solutions are provided to our business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements toward a successful implementation and ensure all master data meets requirements and is deployed and implemented properly.

PRIMARY RESPONSIBILITIES
- Engage with multiple teams (both technical and non-technical) to understand master data requirements and objectives.
- Implement and enforce data governance policies and procedures to maintain the quality and integrity of master data.
- Perform data entry, validation, and maintenance tasks to ensure accuracy and consistency of master data records.
- Develop and maintain data standards and guidelines for various data elements to be used consistently across the organization.
- Collaborate with multiple teams to define and implement data structures and hierarchies within the Customer, Product, Pricing, and Supplier data domains.
- Identify and resolve data quality issues, including duplication, inconsistency, and inaccuracies.
- Facilitate data integration and migration projects, ensuring seamless data flows between systems.
- Maintain comprehensive documentation of data processes, standards, and best practices.
- Generate reports and analyze data quality metrics to monitor the effectiveness of data management efforts.
- Provide training and support to end users on data entry and data management best practices.
- Ensure that master data management practices align with industry regulations and compliance requirements.
- Provide timely troubleshooting and support for master-data-related problems.
- Ensure data security and compliance with relevant regulations and internal policies.
- Ensure alignment with business objectives, identifying and resolving data discrepancies and ensuring data standards are met.
- Work closely with FaCT, IT, and business stakeholders to ensure seamless data migration.
- Effectively communicate project status, issues, and solutions to both technical and non-technical stakeholders.
- Maintain detailed documentation of data migration processes, decisions, and outcomes, and provide post-migration support.
- Adopt a proactive, hands-on approach to resolve any issues related to these platforms.
- Collaborate with onshore and offshore business and technical teams to create solutions for complex internal business operations.
- Work closely with business partners to define strategies for technical solutions, determine requirements, and develop high-level designs.

REQUIRED KNOWLEDGE/SKILLS/ABILITIES
- Minimum of 5 years of hands-on master data management administration experience, with a focus on the customer, pricing, and product data domains.
- Oracle CX Sales (CDM); experience in VBCS with the AR module (O2C); knowledge of OIC and ATP integration is a plus.
- Knowledge of data quality and data profiling tools.
- Familiarity with data integration and ETL processes.
- Strong understanding of data structures, databases, and data integrations.
- Strong communication skills; proficient in designing and implementing process workflows and data diagrams.
- Proven Agile development experience: you understand what Epics, Features, and Stories are and can define them.
- Excellent problem solver and independent thinker who can create innovative solutions.
- Exceptional communication, analytical, and management skills, with the ability to present technical concepts to both business executives and technical teams.
- Able to manage daily stand-ups, escalations, issues, and risks.
- Self-directed, adaptable, empathetic, flexible, and forward-thinking, with strong organizational, interpersonal, and relationship-building skills conducive to collaboration.
- Passionate about technology, digital transformation, and business process reengineering.

Posted 2 days ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

We are seeking a talented and detail-oriented Data Analyst to join our Reporting Team. In this role, you will specialize in curating insightful and visually compelling reports using tools such as Power BI, Tableau, Python, Excel, and PowerPoint. A key component of this position is integrating AI solutions into our reporting processes to enhance data-driven decision-making for our stakeholders. Collaboration with stakeholders is essential to ensure our reporting solutions effectively meet their needs. If you are passionate about data visualization and leveraging AI technologies, we would love to hear from you!

About the Role
In this opportunity as a Data Analyst, you will:
- Develop, design, and maintain interactive and dynamic reports and dashboards using Power BI, Tableau, Excel, and PowerPoint.
- Collaborate closely with stakeholders to understand their reporting needs, deliver actionable insights, and ensure satisfaction.
- Utilize AI and machine learning techniques to enhance reporting solutions and provide predictive insights.
- Analyze complex datasets to identify trends, patterns, and anomalies that can inform business decisions.
- Ensure data integrity and accuracy in all reporting solutions.
- Provide training and support to team members and stakeholders on the use of reporting tools and AI technologies.
- Continuously seek opportunities to improve reporting processes and tools, staying updated with the latest industry trends and technologies.
- Communicate findings and recommendations to stakeholders through clear and concise presentations and reports.

About You
You're a fit for the role of Data Analyst if your background includes:
- A bachelor's degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field.
- 3-6 years of experience as a Data Analyst or in a similar role, with a strong portfolio of reporting and dashboard projects.
- Proficiency in Power BI, Tableau, Python, Excel, and PowerPoint.
- Experience with AI technologies and machine learning algorithms.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Excellent communication and presentation skills.
- Ability to work collaboratively in a team environment as well as independently.
- Experience with programming languages such as Python or R.
- Familiarity with SQL for data extraction and manipulation.
- Knowledge of data warehousing, ETL processes, and LLMs.

What's in it For You?
- Hybrid Work Model: We've adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow's challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
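To make the "identify trends, patterns, and anomalies" duty concrete, here is a small pandas sketch that flags outliers against a rolling control band. The daily revenue series is synthetic and the three-sigma threshold is an assumption; it is one simple technique among many, not a prescribed method.

```python
# Small illustration of the "trends, patterns, and anomalies" duty above,
# using pandas on made-up daily revenue (the 3-sigma threshold is only one
# simple convention, not a prescribed method).
import pandas as pd

sales = pd.DataFrame({
    "date": pd.date_range("2025-01-01", periods=60, freq="D"),
    "revenue": [1000 + 5 * i for i in range(60)],
})
sales.loc[45, "revenue"] = 4000  # synthetic spike to detect

# A rolling mean/std gives a simple control band; points outside ~3 sigma
# get flagged for review.
roll = sales["revenue"].rolling(window=14)
sales["zscore"] = (sales["revenue"] - roll.mean()) / roll.std()
anomalies = sales[sales["zscore"].abs() > 3]

print(anomalies[["date", "revenue", "zscore"]])
```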

Posted 2 days ago

Apply

6.0 years

10 Lacs

Hyderābād

On-site

Source: Glassdoor

Experience: 6+ years
Work Mode: Hybrid

Job Summary:
We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and business intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using Informatica PowerCenter.
- Troubleshoot and optimize ETL jobs for performance and reliability.
- Analyze complex data sets and write advanced SQL queries for data validation and transformation.
- Collaborate with data architects and business analysts to implement data warehousing solutions.
- Apply SDLC methodologies throughout the ETL development lifecycle.
- Support production environments by identifying and resolving data and performance issues.
- Work with Unix shell scripting for job automation and scheduling.
- Contribute to the design of technical architectures that support digital transformation.

Required Skills:
- 3-5 years of hands-on experience with Informatica PowerCenter.
- Proficiency in SQL and familiarity with NoSQL platforms.
- Experience in ETL performance tuning and troubleshooting.
- Solid understanding of Unix/Linux environments and scripting.
- Excellent verbal and written communication skills.

Preferred Qualifications:
- AWS certification or experience with cloud-based data integration is a plus.
- Exposure to data modeling and data governance practices.

Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: reliably commute or plan to relocate before starting work (required)
Application Question(s):
- What is your current CTC?
- What is your expected CTC?
- What is your current location?
- What is your notice period / last working day?
- Are you comfortable attending an L2 face-to-face interview in Hyderabad?
Experience: Informatica PowerCenter: 5 years (required); total work: 6 years (required)
Work Location: In person

Posted 2 days ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderābād

Remote

Source: Glassdoor

Location: Hyderabad, India (Hybrid)
This is a hybrid position primarily based in Hyderabad, India. We're committed to your flexibility and wellbeing, and our hybrid strategy currently requires three days a week in the office, giving you the option to work remotely for some of your working week. Find out more about our culture of flexible working.

We give you a world of potential
When you join this dynamic team as a Software Engineer, you will enjoy a career, teamwork, flexibility, and leadership you can trust to help accelerate your personal and professional goals. Come be a part of a world of potential at Computershare Business Support Services. Corporate Trust is a market leader with decades of experience as a provider of trustee and sophisticated agency services for private and public companies, investment bankers, asset managers, as well as governments and institutions. We offer a wide range of services, and with a best-in-class reputation built on our high-touch approach to client service, we are looking for people to join us and be a part of our exciting future as one of the top corporate trust firms globally. A key part of this role will be collaborating with our onshore teams to service our Corporate Trust business lines and help us deliver the professional services our clients trust and depend on. If you're a match to those skills, have the passionate drive to be part of something truly amazing, enjoy working on a diverse team, and are willing to learn multiple tasks, then this is the perfect opportunity for you!

A role you will love
This role works within an Agile environment to develop and support applications across the Computershare portfolio, leading moderately complex initiatives and deliverables within its technical domain. Working within cross-functional teams, the role requires strong technical skills, curiosity, a passion for delivering quality solutions, and the drive to continually improve the quality and speed with which we deliver value to the business. You will resolve moderately complex issues and lead a team to meet existing and potential client needs, leveraging a solid understanding of functions, policies, procedures, and compliance requirements. In Computershare Technology Services (CTS) we partner with our global businesses, providing technology services and IT support, and designing and developing new products to support our clients, customers, and employees. These business-aligned CIO teams leverage the expertise and capacity of enterprise-wide teams, such as the Digital Foundry, the Global Development team, and many of our CTO teams. To continually improve our capabilities and speed to market, we have our own innovation, product management, and manufacture practices and frameworks, which are regularly refined. We ensure that colleagues around the world have access to the technology and agreed service levels they need to take care of their clients and their clients' shareholders, employees, and customers.

Some of your key responsibilities will include:
- Apply knowledge of standards, policies, best practice, and organizational structure so that you can work both independently and collaboratively within your team and with key stakeholders.
- Provide informal guidance and share knowledge with colleagues to enable them to contribute to the team's objectives.
- Ensure the quality of tasks, services, and information provided by your team, through the quality of your own work and the support you provide to others, so that your team delivers high-quality, maintainable software that adheres to internal standards and policies.
- Support the evaluation and resolution of technical challenges and blockers to minimize their impact on the team's delivery and/or supported products.
- Identify and support improvements and innovation in technologies and practices within your team that would benefit the business, e.g., efficiency in the software development process or improved customer experience.

What will you bring to the role?
We are a global business with an entrepreneurial spirit, and we are proud of that. What comes with this is a fast-paced environment and lots of change, so you will be resilient in nature and able to adapt quickly and embrace the pace of change we often work at.

We are looking for people with these skills:
- 3-6 years of overall experience in SSRS development.
- Hands-on experience with databases (Oracle/SQL Server) and Reporting Services (SSRS).
- Design and develop Oracle/SQL Server stored procedures, functions, views, and triggers to be used during the ETL process.
- Core knowledge of SSRS reports and SQL Server, including creating complex stored procedures; knowledge of Crystal Reports is also needed in order to convert reports from Crystal to SSRS.
- Very strong at writing and creating SSRS reports and able to work independently; well versed in integrating Oracle and SSRS.
- Designing and developing SSIS / SQL ETL solutions to acquire and prepare data from numerous upstream systems for processing (good to have).
- Understands how to convert Crystal Reports to SSRS; can debug and tune SSRS and suggest improvements.
- Able to write and maintain database objects (tables, views, indexes). Proficient with MS SQL Server: able to write queries and complex stored procedures with a keen eye for finding issues.
- Test and prepare ETL processes for deployment to production and non-production environments.
- Support system and acceptance testing, including the development or refinement of test plans; good understanding of test automation.
- Exposure to Power BI would be an added advantage.
- Collaborates and communicates well, builds great working relationships, influences others, challenges effectively and responds well to challenge from others, shares information and ideas with others, and has good listening skills.
- Has a strong work ethic and is able to deal with sometimes conflicting priorities.
- Curious and a continuous learner: investigates, interprets, and grasps new concepts.
- Self-motivated; uses own initiative and can work with limited guidance to implement innovative solutions.
- Pays attention to detail, finds root causes, and takes a rigorous approach to problem solving.

Rewards designed for you
- Health and wellbeing rewards that can be tailored to support you and your family.
- Save for your future: we will support you along your retirement savings journey.
- Paid parental leave, flexible working, and a caring and inclusive culture.
- Income protection: to ease concerns when the unexpected occurs, our package includes short- and long-term disability benefits, life insurance, supplemental life insurance (single/spouse/family), and more.
- And more: ours is a welcoming and close-knit community, with experienced colleagues ready to help you grow.

Our careers hub will help you find out more about our rewards and life at Computershare; visit computershare.com/careershub.

Posted 2 days ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly to support business operations. You will engage in problem-solving discussions and contribute innovative ideas to enhance application performance and user experience, all while adhering to project timelines and quality standards.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure best practices and quality standards are maintained.

Professional & Technical Skills:
- Must-have skills: proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with application development lifecycle methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 days ago

Apply

7.0 - 8.0 years

4 - 7 Lacs

Hyderābād

On-site

Source: Glassdoor

Location: Hyderabad, IN
Employment type: Employee
Place of work: Office
Offshore/Onshore: Onshore

TechnipFMC is committed to driving real change in the energy industry. Our ambition is to build a sustainable future through relentless innovation and global collaboration, and we want you to be part of it. You'll be joining a culture that values curiosity, expertise, and ideas, as well as diversity, inclusion, and authenticity. Bring your unique energy to our team of more than 20,000 people worldwide, and discover a rewarding, fulfilling, and varied career that you can take anywhere you want to go.

Job Purpose
The Data Analyst plays a crucial lead role in managing and optimizing business intelligence solutions using Power BI.

Job Description
- Leadership and Strategy: Lead the design, development, and deployment of Power BI reports and dashboards. Provide strategic direction for data visualization and business intelligence initiatives. Interface with the business owner, project manager, planning manager, resource managers, etc. Develop a roadmap for the execution of complex data analytics projects.
- Data Modeling and Integration: Develop complex data models, establish relationships, and ensure data integrity. Oversee data integration from various sources.
- Advanced Analytics: Perform advanced data analysis using DAX (Data Analysis Expressions) and other analytical tools to derive insights and support decision-making.
- Collaboration: Work closely with stakeholders to gather requirements, define data needs, and ensure the delivery of high-quality BI solutions.
- Performance Optimization: Optimize solutions for performance, ensuring efficient data processing and report rendering.
- Mentorship: Mentor and guide junior developers, providing technical support and best practices for Power BI development.
- Data Security: Implement and maintain data security measures, ensuring compliance with data protection regulations.
- Demonstrated experience leading complex projects with a team of varied experience levels.

You are meant for this job if you have:
- Educational background: a bachelor's or master's degree in Computer Science, Information Systems, or a related field.
- Experience working with unstructured data and data integration.
- Technical skills: proficiency in Power BI, DAX, SQL, and data modeling, with exposure to data engineering; experience with data integration tools and ETL processes; hands-on experience with Snowflake.
- Experience: 7-8 years in business intelligence and data analytics, with a focus on Power BI.
- Soft skills: strong analytical and problem-solving skills, excellent communication abilities, and the capacity to lead and collaborate with global cross-functional teams.

Skills
- Change Leadership
- Process Mapping

Being a global leader in the energy industry requires an inclusive and diverse environment. TechnipFMC promotes diversity, equity, and inclusion by ensuring equal opportunities to all ages, races, ethnicities, religions, sexual orientations, gender expressions, disabilities, and all other pluralities. We celebrate who you are and what you bring. Every voice matters, and we encourage you to add to our culture. TechnipFMC respects the rights and dignity of those it works with and promotes adherence to internationally recognized human rights principles for those in its value chain.

Date posted: Jun 16, 2025
Requisition number: 13774

Posted 2 days ago

Apply

5.0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries serving more than 6 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams discover, value, and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.

About the Role
We are looking for a Senior Data Engineer with a collaborative, can-do attitude who is committed and strives with determination and motivation to make their team successful; a senior engineer who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse, and will help drive Circle K's next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Senior Data Engineer will create, troubleshoot, and support ETL pipelines and the cloud infrastructure involved in the process, and will be able to support the visualization team.

Roles and Responsibilities
- Collaborate with business stakeholders and other technical team members to acquire and migrate the data sources most relevant to business needs and goals.
- Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options.
- Determine the solutions best suited to developing a pipeline for a particular data source.
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
- Be efficient in ETL/ELT development using Azure cloud services and Snowflake, including testing and operations/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance).
- Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery.
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
- Be proactive in stakeholder communication; mentor and guide junior resources through regular knowledge transfer (and reverse knowledge transfer), help them identify production bugs and issues when needed, and provide resolution recommendations.

Job Requirements
- Bachelor's degree in Computer Engineering, Computer Science, or a related discipline; master's degree preferred.
- 5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment.
- 5+ years of experience setting up and operating data pipelines using Python or SQL.
- 5+ years of advanced SQL programming: PL/SQL, T-SQL.
- 5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
- Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
- 5+ years of strong, extensive hands-on experience in Azure, preferably on data-heavy / analytics applications leveraging relational and NoSQL databases, data warehouses, and big data.
- 5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
- 5+ years of experience defining and enabling data quality standards for auditing and monitoring.
- Strong analytical abilities and strong intellectual curiosity.
- In-depth knowledge of relational database design, data warehousing, and dimensional data modeling concepts.
- Understanding of REST and good API design.
- Experience working with Apache Iceberg, Delta tables, and distributed computing frameworks.
- Strong collaboration and teamwork skills, and excellent written and verbal communication skills.
- Self-starter, motivated, and able to work in a fast-paced development environment; Agile experience highly desirable.
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.

Knowledge
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
- Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
- Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
- Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools.
- Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring, and maintenance).
- Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
- ADF, Databricks, and Azure certifications are a plus.

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
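Given the posting's emphasis on Snowflake pipelines, a minimal sketch of a load-and-verify step using the official snowflake-connector-python may be useful. The account locator, credentials, file path, and table are placeholders, and real pipelines in this stack would more likely be orchestrated through ADF or Databricks.

```python
# Hedged sketch of a load-and-verify step with snowflake-connector-python.
# The account locator, credentials, file path, and table are placeholders;
# pipelines in this stack would usually be orchestrated via ADF/Databricks.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.east-us-2.azure",  # placeholder account locator
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Stage a file an upstream step produced, then COPY it into the target table.
cur.execute(
    "CREATE TABLE IF NOT EXISTS daily_sales "
    "(store_id INT, sale_date DATE, amount NUMBER(12,2))"
)
cur.execute("PUT file:///tmp/daily_sales.csv @%daily_sales")  # table stage
cur.execute(
    "COPY INTO daily_sales FROM @%daily_sales "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

# Basic reconciliation: row count and date range after the load, the kind of
# audit/monitoring check the requirements above call for.
cur.execute("SELECT COUNT(*), MIN(sale_date), MAX(sale_date) FROM daily_sales")
print(cur.fetchone())
conn.close()
```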

Posted 2 days ago

Apply

6.0 years

0 Lacs

Gurgaon

On-site

Source: Glassdoor

About Gartner IT
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About this role
The Senior Software Engineer will provide technical expertise in designing and building Master Data Management solutions and other Chief Data Office initiatives to meet shifting organizational demands, and will be responsible for building the Master Data Management solution on the Ataccama MDM platform. You will be part of the CDO execution team, working on the MDM program and warehouse. MDM brings data in from multiple sources and enriches it through validation, standardization, and dedupe processes. It is a centralized hub for the contact and account domains across Gartner, standardizing and enriching information and sharing it across multiple systems within Gartner. Enrichment also includes fetching the latest and greatest data from multiple vendors and sharing that information across systems.

What you'll do:
- Review and analyze business requirements and design technical mapping documents.
- Build new processes in Ataccama.
- Build new ETL jobs, including ETL using Azure Data Factory and Synapse.
- Help define best practices and processes.
- Collaborate on Master Data Management, architecture, and technical design discussions.
- Perform and participate in code reviews, peer inspections, and technical design and specification work, as well as document and review detailed designs.
- Provide status reports to higher management.
- Maintain service levels and department goals for problem resolution.

What you'll need:
A strong IT professional with 6+ years of experience in ETL, Master Data Management solutions, and database operations, with strong analytical and problem-solving skills.

Must have:
- Experience in database operations with a bachelor's degree (Computer Science preferred).
- Understanding of data modelling.
- Hands-on experience in MDM implementation using MDM tools (customer domain, product domain, etc.); Ataccama preferred.
- Experience in ETL technology and PL/SQL.
- Experience in PostgreSQL and cloud databases.
- Good exposure to writing complex SQL.
- Hands-on experience with Synapse.
- Commitment to teamwork as a contributor.

Nice to have:
- Good knowledge of cloud technology and exposure to cloud tools.
- Good understanding of business processes and analyzing underlying data.
- Experience with the Python or Java programming languages.
- Experience with an Agile methodology like Scrum.

Who are you:
- Bachelor's degree or foreign equivalent degree in Computer Science or a related field required.
- Excellent communication skills.
- Able to work independently or within a team proactively in a fast-paced Agile-Scrum environment.
- Owns success: takes responsibility for the successful delivery of solutions.
- Strong desire to improve upon skills in tools and technologies.

Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting, and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy, and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities, and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation, and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive, working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging, and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 101125
By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
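The "validation/standardization and dedupe" step the posting describes can be pictured with a toy example. The pandas sketch below standardizes contact fields and collapses duplicates to a single golden record; the records are invented, and a production hub such as Ataccama would apply configurable survivorship rules rather than this "first source wins" shortcut.

```python
# Toy illustration of the standardize-then-dedupe step an MDM hub performs.
# The contact records are invented; Ataccama itself is configured in its own
# platform, and real survivorship rules are richer than "first source wins".
import pandas as pd

contacts = pd.DataFrame({
    "email": ["A.Smith@Example.com ", "a.smith@example.com", "b.lee@example.com"],
    "name": ["Alice Smith", "alice  smith", "Bob Lee"],
    "source": ["crm", "webform", "crm"],
})

# Standardize: trim and lowercase emails, collapse whitespace in names.
contacts["email_std"] = contacts["email"].str.strip().str.lower()
contacts["name_std"] = (
    contacts["name"].str.strip().str.lower().str.split().str.join(" ")
)

# Dedupe: records sharing a standardized email collapse to one golden record.
golden = contacts.sort_values("source").drop_duplicates(
    subset=["email_std"], keep="first"
)
print(golden[["email_std", "name_std", "source"]])
```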

Posted 2 days ago


7.0 - 9.0 years

0 Lacs

Gurgaon

On-site


Role Description: As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements.

Roles & Responsibilities:
- Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues

Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
- Experience in building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner
- Understands the company's long-term vision and aligns with it
- Open to new ideas and willing to learn and develop new skills; able to work well under pressure and manage multiple tasks and priorities

Qualifications:
- 7-9 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred

Job Types: Full-time, Permanent
Schedule: Day shift / Morning shift
Work Location: In person
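The pipeline-building responsibilities above are easiest to picture in code. Below is a minimal PySpark sketch of the kind of S3-to-S3 ETL such a platform runs; the bucket paths, column names, and schema are hypothetical, and on AWS a job like this would typically be submitted through Glue or EMR rather than run standalone.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files landed in S3 (paths are placeholders).
orders = spark.read.option("header", True).csv("s3://raw-zone/orders/2025/06/")

# Transform: cast types, drop incomplete rows, dedupe, derive a partition column.
clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropna(subset=["order_id", "order_ts"])
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet to the curated zone for Redshift Spectrum/Athena.
(clean.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3://curated-zone/orders/"))

spark.stop()
```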

Posted 2 days ago


5.0 years

4 - 8 Lacs

Gurgaon

On-site


Job details
Employment Type: Full-Time
Location: Gurgaon, Sector, India
Job Category: Innovation & Technology
Job Number: WD30242868

Job Description

Who we are: Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we've been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world's largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play. This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most important, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet. We're helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We are committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place. Please visit and follow Johnson Controls LinkedIn for recent exciting activities.

Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive

How will you do it?
- Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices
- Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes
- Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives
- Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance
- Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying
- Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows
- Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures
- Training and Support: Provide training and support to teams on Snowflake usage and best practices
- Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity
- Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies
- Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases

What we look for:
- Minimum: Bachelor's/Postgraduate/Master's degree in any stream
- Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or similar role
- Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
- Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
- Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
- Strong knowledge of databases, stored procedures (SPs) and optimization of large data sets
- SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool
- Excellent communication and collaboration skills
- Strong problem-solving abilities and an analytical mindset
- Ability to work in a fast-paced, dynamic environment

What we offer: We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.

Diversity & Inclusion: Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers' vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee's responsibility to contribute to our culture. It's through these contributions that we'll drive the mindsets and behaviors we need to power our customers' missions. You have the power. You have the voice. You have the culture in your hands.
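To make the Snowflake ELT responsibilities listed under "How will you do it?" concrete, here is a minimal sketch using the snowflake-connector-python package. The account, warehouse, stage, and table names are all hypothetical placeholders, and credentials would normally come from a secrets manager rather than being inlined.

```python
import snowflake.connector

# Connection parameters are placeholders, not a real account.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_SVC",
    password="...",            # placeholder; use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # ELT pattern: land raw files with COPY, then transform with SQL in Snowflake.
    cur.execute("""
        COPY INTO staging.sensor_readings
        FROM @landing_stage/sensor/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    cur.execute("""
        INSERT INTO core.daily_energy_use
        SELECT building_id, DATE_TRUNC('day', read_ts), SUM(kwh)
        FROM staging.sensor_readings
        GROUP BY 1, 2
    """)
finally:
    cur.close()
    conn.close()
```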

Posted 2 days ago


8.0 years

0 Lacs

Gurgaon

On-site


Project description
We are seeking an experienced Senior Project Manager with a strong background in delivering data engineering and Python-based development projects. In this role, you will manage cross-functional teams and lead Agile delivery for high-impact, cloud-based data initiatives. You'll work closely with data engineers, scientists, architects, and business stakeholders to ensure projects are delivered on time, within scope, and aligned with strategic objectives. The ideal candidate combines technical fluency, strong leadership, and Agile delivery expertise in data-centric environments.

Responsibilities
- Lead and manage data engineering and Python-based development projects, ensuring timely delivery and alignment with business goals.
- Work closely with data engineers, data scientists, architects, and product owners to gather requirements and define project scope.
- Translate complex technical requirements into actionable project plans and user stories.
- Oversee sprint planning, backlog grooming, daily stand-ups, and retrospectives in Agile/Scrum environments.
- Ensure best practices in Python coding, data pipeline design, and cloud-based data architecture are followed.
- Identify and mitigate risks, manage dependencies, and escalate issues when needed.
- Own stakeholder communications, reporting, and documentation of all project artifacts.
- Track KPIs and delivery metrics to ensure accountability and continuous improvement.

Skills

Must have
- Experience: Minimum 8+ years of project management experience, including 3+ years managing data and Python-based development projects.
- Agile Expertise: Strong experience delivering projects in Agile/Scrum environments with distributed or hybrid teams.
- Technical Fluency: Solid understanding of Python, data pipelines, and ETL/ELT workflows. Familiarity with cloud platforms such as AWS, Azure, or GCP. Exposure to tools like Airflow, dbt, Spark, Databricks, or Snowflake is a plus.
- Tools: Proficiency with JIRA, Confluence, Git, and project dashboards (e.g., Power BI, Tableau).
- Soft Skills: Strong communication, stakeholder management, and leadership skills. Ability to translate between technical and non-technical audiences. Skilled in risk management, prioritization, and delivery tracking.

Nice to have
N/A

Other
Languages: English (C1 Advanced)
Seniority: Senior
Location: Gurugram, India
Requisition: Req. VR-115111, Technical Project Management, BCM Industry, 16/06/2025

Posted 2 days ago


8.0 years

3 - 8 Lacs

Gurgaon

On-site


Date: Jun 5, 2025
Job Requisition Id: 61535
Location: Gurgaon, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Microsoft Fabric professionals in the following areas:

Position: Data Analytics Lead
Experience: 8+ Years

Responsibilities:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Bring experience working with the manufacturing industry in building a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Manage customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends, and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business through modelling the data to building reports and dashboards.

Required Skills:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountability for the data group's activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and data-related services.
- Experience in designing and building operationally efficient pipelines utilising core Azure components such as Azure Data Factory, Azure Databricks and PySpark.
- Strong understanding of data architecture, data modelling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SMEs, and to apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools and methodologies. Able to analyse the impact of a change request/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs on design and architecture adhering to industry standards/practices, analyze various frameworks/tools, review code, and provide feedback on improvement opportunities.
- Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of the tools and frameworks available in the market, use them per customer requirements, and explore new tools/frameworks for implementation.
- Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
- Analytics Solution Design: Knowledge of statistical and machine learning techniques such as classification, linear regression modelling, clustering and decision trees. Able to identify the cause of errors and their potential solutions.
- Tools & Platform Knowledge: Familiarity with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures the accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture

Posted 2 days ago


7.0 years

7 - 7 Lacs

Gurgaon

On-site


Engineer III, Database Engineering
Gurgaon, India; Hyderabad, India
Information Technology
316332

Job Description

About The Role: Grade Level (for internal use): 10

Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P Capital IQ Pro to serve up value-added ratings, research and related information to institutional clients.

The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format that can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle/SQL/.Net/Informatica/Kafka/Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?

Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
- Developing innovative functionality in existing and new products
- Supporting and maintaining high-revenue productionized products
- Achieving the above intelligently and economically using best practices

Career: This is the place to hone your existing database skills while having the chance to become exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated and collaborate with developers, business analysts and product managers who are experts in their domain.

Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the areas below:
- The complete SDLC: architecture, design, development and support of tech solutions
- Playing a key role in the development team to build high-quality, high-performance, scalable code
- Engineering components and common services based on standard corporate development models, languages and tools
- Producing technical design documents and conducting technical walkthroughs
- Collaborating effectively with technical and non-technical stakeholders
- Being part of a culture that continuously improves the technical design and code base
- Documenting and demonstrating solutions using technical design docs, diagrams and stubbed code

Our Hiring Manager says: "I'm looking for a person that gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company."

Qualifications

Required:
- Bachelor's degree in Computer Science, Information Systems or Engineering
- 7+ years of experience with transactional databases like SQL Server, Oracle and PostgreSQL, and other NoSQL databases like Amazon DynamoDB and MongoDB
- Strong database development skills on SQL Server and Oracle
- Strong knowledge of database architecture, data modeling and data warehousing
- Knowledge of object-oriented design and design patterns
- Familiarity with various design and architectural patterns
- Strong development experience with Microsoft SQL Server
- Experience in cloud-native development; AWS is a big plus
- Experience with Kafka/Sonic broker messaging systems

Nice to have:
- Experience in developing data pipelines using Java or C# is a significant advantage
- Strong knowledge of ETL tools (Informatica, SSIS); exposure to Informatica is an advantage
- Familiarity with Agile and Scrum models
- Working knowledge of VSTS
- Working knowledge of the AWS cloud is an added advantage
- Understanding of the fundamental design principles for building a scalable system
- Understanding of financial markets and asset classes like equity, commodity, fixed income, options and index/benchmarks is desirable
- Additionally, experience with Scala, Python and Spark applications is a plus

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You?

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & Wellness: Health care coverage designed for the mind and body.
- Flexible Downtime: Generous time off helps keep you energized for your time on.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.

For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)

Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India

Posted 2 days ago


3.0 - 5.0 years

6 - 13 Lacs

Gurgaon

On-site


Role: Data Engineer
Experience: 3-5 Years
Location: Gurgaon (Onsite)
Notice Period: Immediate

Key Skills Required:
- Python
- Apache Spark
- Databricks
- Machine Learning (basic to intermediate understanding)
- ETL/Data Pipelines
- SQL (nice to have)

Role Overview
We're looking for a Data Engineer with 3-5 years of experience to build and optimize data pipelines using Python and Spark, with hands-on experience in Databricks. The ideal candidate should also have exposure to implementing machine learning models and collaborating across teams to deliver scalable data solutions.

Responsibilities
- Build and maintain efficient, scalable data pipelines using Python and Apache Spark.
- Work closely with analytics and engineering teams to develop data-driven solutions.
- Use Databricks for processing, analyzing, and visualizing large datasets.
- Apply machine learning techniques for data insights and automation.
- Improve the performance, reliability, and quality of data infrastructure.
- Monitor data integrity across the entire data lifecycle.

Required Qualifications
- Strong hands-on experience with Python and Apache Spark.
- Proficiency in working with Databricks for engineering workflows.
- Good understanding of machine learning concepts and the ability to implement models.
- Familiarity with ETL processes, data warehousing, and SQL.
- Strong communication and problem-solving skills.

Educational Background
BE/BTech/BIT/MCA/BCA or a related technical degree.

Job Type: Full-time
Pay: ₹600,000.00 - ₹1,300,000.00 per year
Schedule: Morning shift
Work Location: In person
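Since this role pairs Spark pipelines with basic machine learning, here is a minimal Spark ML sketch of the second half of that combination. The table and column names are hypothetical, and the model is a deliberately simple baseline; the point is the pipeline shape, not the modeling choice.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn-model").getOrCreate()

# Hypothetical feature table maintained by the pipeline (one row per customer).
df = spark.table("analytics.customer_features")

# Assemble numeric columns into the single vector column Spark ML expects.
assembler = VectorAssembler(inputCols=["events_30d", "spend_30d"],
                            outputCol="features")
data = (assembler.transform(df)
        .select("features", F.col("churned").cast("double").alias("label")))

train, test = data.randomSplit([0.8, 0.2], seed=42)

# Fit a baseline classifier and report held-out AUC.
model = LogisticRegression(maxIter=50).fit(train)
print("AUC:", model.evaluate(test).areaUnderROC)
```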

Posted 2 days ago


0 years

0 Lacs

Pune, Maharashtra, India

On-site


We’re seeking a talented and passionate Trainer to join our dynamic team in making a remarkable impact on the future of technology. The ideal candidate should have a strong base in technological concepts and a keen interest in delivery and mentoring. The role involves delivering best-in-class training sessions, supporting curriculum development, and providing hands-on guidance to learners.

Responsibilities - What You'll Do

Training Coordination, Support & Delivery
- Assist in scheduling and coordinating training sessions
- Deliver classroom-based and virtual instructor-led training (ILT) sessions on various organizational products, platforms and technologies
- Conduct hands-on training, workshops, and exercises to reinforce learning
- Manage training attendance records and assessments

Learner Engagement
- Help ensure learners have access to relevant resources
- Address learner queries and create a positive learning environment
- Ensure a smooth learning experience throughout the learning cycle
- Track learners' progress through specific assessments and exercises
- Prepare learners for industry-standard certifications

Curriculum Development
- Create structured learning paths for various experience levels
- Develop course materials, decks, and guides for training
- Update training content, available in various formats, based on industry trends and technological advancements, as and when applicable
- Prepare learners with practical applications of product offerings' concepts

Key Skills & Experience - What We're Looking For

Technical Skills
Knowledge of any of the following technologies and industry advancements:
- Familiarity with the GenAI landscape, Machine Learning (ML), or a related area
- Proficiency in Data Engineering, Apache NiFi, FlowFiles, data integration and flow management, ETL, and data warehousing concepts
- Knowledge of Python, SQL and other relevant programming languages
- Strong expertise in LCNC development (UI/UX principles, Java, JavaScript frameworks)
- Experience with APIs and microservices
- Fundamental understanding of web application development

Training & Mentoring Skills
- Prior experience in conducting product-based or technology-based training sessions
- Ability to simplify complex technical concepts for easy understanding
- Must have delivery experience in both virtual and in-class trainings
- Excellent articulation, collaboration and mentoring skills

Content Creation
- Experience in content creation and editing of training videos

Qualifications & Experience
- Bachelor's/Master's degree in Computer Science, Engineering or a related field
- 5+ years of experience in cloud-based technologies or Artificial Intelligence (AI)
- Experience in training or coaching in a corporate or academic environment preferred
- Must have MS PowerPoint knowledge, and Camtasia or other video editing skills

Posted 2 days ago


3.0 years

0 Lacs

Gurgaon

On-site


Job Description

Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries, serving more than 6 million customers each day.

It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.

About the role

We are looking for a Data Engineer with a collaborative, "can-do" attitude who is committed and strives with determination and motivation to make their team successful; a Data Engineer who has experience implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K's next phase in the digital journey by transforming data to achieve actionable business outcomes.

Roles and Responsibilities
- Collaborate with business stakeholders and other technical team members to acquire and migrate the data sources that are most relevant to business needs and goals
- Demonstrate technical and domain knowledge of relational and non-relational databases, data warehouses, and data lakes, among other structured and unstructured storage options
- Determine the solutions best suited to develop a pipeline for a particular data source
- Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development
- Deliver efficient ELT/ETL development using Azure cloud services and Snowflake, including testing and operational support (RCA, monitoring, maintenance)
- Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery
- Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders
- Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability)
- Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions
- Build a cross-platform data strategy to aggregate multiple sources and process development datasets
- Be proactive in stakeholder communication; mentor and guide junior resources through regular KT/reverse KT, and help them identify production bugs/issues and provide resolution recommendations

Job Requirements
- Bachelor's degree in Computer Engineering, Computer Science or a related discipline; Master's degree preferred
- 3+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional data warehousing environment
- 3+ years of experience setting up and operating data pipelines using Python or SQL
- 3+ years of advanced SQL programming: PL/SQL, T-SQL
- 3+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization
- Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads
- 3+ years of strong and extensive hands-on experience in Azure, preferably on data-heavy/analytics applications leveraging relational and NoSQL databases, data warehouses and big data
- 3+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions
- 3+ years of experience defining and enabling data quality standards for auditing and monitoring
- Strong analytical abilities and strong intellectual curiosity
- In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts
- Understanding of REST and good API design
- Experience working with Apache Iceberg, Delta tables and distributed computing frameworks
- Strong collaboration and teamwork skills; excellent written and verbal communication skills
- Self-starter, motivated, with the ability to work in a fast-paced development environment
- Agile experience highly desirable
- Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools

Preferred Skills
- Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management)
- Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data-sharing techniques
- Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks
- Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM) and data quality tools
- Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance)
- Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting
- ADF, Databricks and Azure certification is a plus

Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake

#LI-DS1
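Because the requirements above emphasize data quality standards for auditing and monitoring, here is a minimal sketch of post-load audit checks. It uses SQLite from Python's standard library purely as a stand-in for the warehouse (in the stack described, the same queries would run against Snowflake or Azure Synapse), and it assumes staging and fact tables with the invented names below already exist.

```python
import sqlite3

# SQLite stands in for the warehouse; table/column names are hypothetical.
conn = sqlite3.connect("warehouse.db")

checks = {
    # Row-count reconciliation between staging and target.
    "row_count_match": """
        SELECT (SELECT COUNT(*) FROM stg_sales) = (SELECT COUNT(*) FROM fact_sales)
    """,
    # Business-key completeness: no NULL store IDs should survive the load.
    "no_null_store_id": "SELECT COUNT(*) = 0 FROM fact_sales WHERE store_id IS NULL",
    # Freshness: the latest loaded date should be within the last day.
    "fresh_data": "SELECT MAX(sale_date) >= DATE('now', '-1 day') FROM fact_sales",
}

# Each check returns 1 (true) when it passes; collect anything else as a failure.
failures = [name for name, sql in checks.items()
            if conn.execute(sql).fetchone()[0] != 1]

if failures:
    raise RuntimeError(f"Data quality checks failed: {failures}")
```

A scheduler (ADF, Airflow, or a Databricks job) would typically run a script like this right after each load and alert on the raised error.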

Posted 2 days ago


10.0 years

2 - 8 Lacs

Gurgaon

On-site


Requisition Number: 101362
Architect II
Location: This is a hybrid position located in Delhi NCR, Hyderabad, Pune, Trivandrum or Bangalore, India.

Insight at a Glance
- 14,000+ engaged teammates globally
- #20 on Fortune's World's Best Workplaces™ list
- $9.2 billion in revenue
- Received 35+ industry and partner awards in the past year
- $1.4M+ total charitable contributions in 2023 by Insight globally

Now is the time to bring your expertise to Insight. We are not just a tech company; we are a people-first company. We believe that by unlocking the power of people and technology, we can accelerate transformation and achieve extraordinary results. As a Fortune 500 Solutions Integrator with deep expertise in cloud, data, AI, cybersecurity, and intelligent edge, we guide organisations through complex digital decisions.

About the role

The Architect II - Data will focus on leading our Business Intelligence (BI) and Data Warehousing (DW) initiatives. This role involves designing and implementing end-to-end data pipelines using cloud services and data frameworks, and collaborating with stakeholders and ETL/BI developers in an agile environment to create scalable, secure data architectures aligned with business requirements, industry best practices, and regulatory compliance.

Responsibilities:
- Architect and implement end-to-end data pipelines, data lakes, and warehouses using modern cloud services and architectural patterns.
- Develop and build analytics tools that deliver actionable insights to the business.
- Integrate and manage large, complex data sets to meet strategic business requirements.
- Optimize data processing workflows using frameworks such as PySpark.
- Establish and enforce best practices for data quality, integrity, security, and performance across the entire data ecosystem.
- Collaborate with cross-functional teams to prioritize deliverables and design solutions.
- Develop compelling business cases and return on investment (ROI) analyses to support strategic initiatives.
- Drive process improvements for enhanced data delivery speed and reliability.
- Provide technical leadership, training, and mentorship to team members, promoting a culture of excellence.

Qualifications:
- 10+ years in Business Intelligence (BI) solution design, with 8+ years specializing in ETL processes and data warehouse architecture.
- 8+ years of hands-on experience with Azure data services, including Azure Data Factory, Azure Databricks, Azure Data Lake Gen2, Azure SQL DB, Synapse, Power BI, and MS Fabric (knowledge).
- Strong Python and PySpark software engineering proficiency, coupled with a proven track record of building and optimizing big data pipelines, architectures, and datasets.
- Proficiency in transforming, processing, and extracting insights from vast, disparate datasets, and in building robust data pipelines for metadata, dependency, and workload management.
- Familiarity with software development lifecycles/methodologies, particularly Agile.
- Experience with SAP/ERP/Datasphere data modeling is a significant plus.
- Excellent presentation and collaboration skills, capable of creating formal documentation and supporting cross-functional teams in a dynamic environment.
- Strong problem-solving, time management, and organizational abilities.
- Keenness to continually learn new languages and technologies.
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems, or an equivalent field.

What you can expect

We're legendary for taking care of you and your family, and for helping you engage with your local community. We want you to enjoy a full, meaningful life and own your career at Insight. Some of our benefits include:
- Freedom to work from another location, even an international destination, for up to 30 consecutive calendar days per year.

But what really sets us apart are our core values of Hunger, Heart, and Harmony, which guide everything we do, from building relationships with teammates, partners, and clients to making a positive impact in our communities.

Join us today; your ambITious journey starts here.

Insight is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, sexual orientation or any other characteristic protected by law. When you apply, please tell us the pronouns you use and any reasonable adjustments you may need during the interview process. At Insight, we celebrate diversity of skills and experience, so even if you don't feel like your skills are a perfect match, we still want to hear from you!

Insight India Location: Level 16, Tower B, Building No 14, DLF Cyber City IT/ITES SEZ, Sector 24 & 25A, Gurugram, Gurgaon, HR 122002, India

Posted 2 days ago


4.0 - 8.0 years

25 - 30 Lacs

Pune

Hybrid


So, what's the role all about?

As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems, as well as working with cross-functional teams to ensure efficient data processing and integration. You will leverage your knowledge of Apache Spark to create robust ETL processes, optimize data workflows, and manage high volumes of structured and unstructured data.

How will you make an impact?
- Design, implement, and maintain data pipelines using Apache Spark for processing large datasets.
- Work with data engineering teams to optimize data workflows for performance and scalability.
- Integrate data from various sources, ensuring clean, reliable, and high-quality data for analysis.
- Develop and maintain data models, databases, and data lakes.
- Build and manage scalable ETL solutions to support business intelligence and data science initiatives.
- Monitor and troubleshoot data processing jobs, ensuring they run efficiently and effectively.
- Collaborate with data scientists, analysts, and other stakeholders to understand business needs and deliver data solutions.
- Implement data security best practices to protect sensitive information.
- Maintain a high level of data quality and ensure timely delivery of data to end-users.
- Continuously evaluate new technologies and frameworks to improve data engineering processes.

Have you got what it takes?
- 8-11 years of experience as a Data Engineer, with a strong focus on Apache Spark and big data technologies.
- Expertise in Spark SQL, DataFrames, and RDDs for data processing and analysis.
- Proficiency in programming languages such as Python, Scala, or Java for data engineering tasks.
- Hands-on experience with cloud platforms like AWS, specifically with data processing and storage services (e.g., S3, BigQuery, Redshift, Databricks).
- Experience with ETL frameworks and tools such as Apache Kafka, Airflow, or NiFi.
- Strong knowledge of data warehousing concepts and technologies (e.g., Redshift, Snowflake, BigQuery).
- Familiarity with containerization technologies like Docker and Kubernetes.
- Knowledge of SQL and relational databases, with the ability to design and query databases effectively.
- Solid understanding of distributed computing, data modeling, and data architecture principles.
- Strong problem-solving skills and the ability to work with large and complex datasets.
- Excellent communication and collaboration skills to work effectively with cross-functional teams.

What's in it for you?

Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX!

At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7235
Reporting into: Tech Manager
Role Type: Individual Contributor
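Given the Spark-plus-Kafka emphasis in this role, here is a minimal Spark Structured Streaming sketch of a Kafka-to-data-lake ingestion job. The broker address, topic, schema, and lake paths are hypothetical, and the job assumes the spark-sql-kafka connector package is available on the classpath.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Expected shape of each Kafka message payload (illustrative fields).
schema = (StructType()
          .add("event_id", StringType())
          .add("amount", DoubleType())
          .add("event_ts", TimestampType()))

# Source: subscribe to a hypothetical topic.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")
       .option("subscribe", "payments")
       .load())

# Kafka delivers bytes; decode the value and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
             .select(F.from_json("json", schema).alias("e"))
             .select("e.*"))

# Sink: Parquet files in the lake, with a checkpoint for fault tolerance.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://lake/payments/")
         .option("checkpointLocation", "s3a://lake/_checkpoints/payments/")
         .start())
query.awaitTermination()
```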

Posted 2 days ago


0 years

0 Lacs

Gurgaon

On-site


Must have: Strong PostgreSQL database knowledge: writing procedures and functions, writing dynamic code, performance tuning in PostgreSQL, complex queries, and UNIX.

Good to have: IDMC or any other ETL tool knowledge, Airflow DAGs, Python, MS calls.

About Virtusa

Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth, one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us.

Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.

Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
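The must-have list above centers on PL/pgSQL procedures and dynamic SQL. Below is a minimal sketch, using the psycopg2 driver, that creates and calls a dynamic-SQL function; the DSN, function name, and table name are illustrative only, not part of any actual Virtusa project.

```python
import psycopg2

conn = psycopg2.connect("dbname=appdb user=etl")  # DSN is a placeholder
cur = conn.cursor()

# A small PL/pgSQL function using dynamic SQL (format + EXECUTE), the kind
# of "dynamic code" the posting refers to.
cur.execute("""
CREATE OR REPLACE FUNCTION row_count_for(tbl text)
RETURNS bigint
LANGUAGE plpgsql AS $$
DECLARE
    n bigint;
BEGIN
    -- %I quotes the identifier safely, avoiding injection in dynamic SQL.
    EXECUTE format('SELECT count(*) FROM %I', tbl) INTO n;
    RETURN n;
END;
$$;
""")
conn.commit()

# Call it against a hypothetical table.
cur.execute("SELECT row_count_for(%s)", ("orders",))
print(cur.fetchone()[0])

cur.close()
conn.close()
```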

Posted 2 days ago


0 years

0 Lacs

Gurgaon

On-site


Job Description: We are seeking a skilled PL/SQL Developer with hands-on experience in the insurance domain, especially with Ingenium (Policy Administration System). The ideal candidate will support and enhance legacy systems, contribute to data migration projects, and collaborate closely with business and technical teams to ensure seamless insurance operations.

Key Responsibilities:
- Develop and maintain complex PL/SQL scripts, procedures, triggers, and packages.
- Work on enhancements, bug fixes, and performance tuning of Oracle-based insurance applications.
- Support and maintain the Ingenium PAS for life insurance policies.
- Participate in data analysis, ETL processing, and migration activities from Ingenium.
- Collaborate with business analysts, QA teams, and end-users to deliver solutions aligned with business needs.
- Document technical specifications and workflows for future reference.

Required Skills:
- Strong hands-on experience in Oracle PL/SQL development.
- Experience working with Ingenium (Life/Annuities Policy Administration System).
- Understanding of insurance products such as life, annuities, riders, underwriting, and claims.
- Experience with batch processing, UAT support, and production issue resolution.
- Familiarity with SDLC methodologies; Agile/Scrum is a plus.

Preferred Qualifications:
- Knowledge of mainframe/COBOL systems is a plus (if Ingenium is on the mainframe).
- Experience in data migration projects involving Ingenium.
- Bachelor's degree in Computer Science or a related field.

Job Type: Full-time
Pay: ₹100,000.00 - ₹1,100,000.00 per month
Schedule: Day shift
Work Location: In person
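To ground the PL/SQL responsibilities above, here is a minimal sketch of running an anonymous PL/SQL block from Python with the python-oracledb driver. The connection details, table names, and business rule are hypothetical stand-ins for the kind of batch fix-up routine such a role maintains against a policy administration schema; they are not Ingenium's actual tables.

```python
import oracledb

# Connection details are placeholders; use a wallet/secrets store in practice.
conn = oracledb.connect(user="pas_ops", password="...", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Anonymous PL/SQL block: lapse riders on policies more than 3 months unpaid.
plsql = """
BEGIN
    UPDATE policy_rider pr
       SET pr.status = 'LAPSED'
     WHERE pr.policy_id IN (
           SELECT p.policy_id FROM policy p
            WHERE p.paid_to_date < ADD_MONTHS(SYSDATE, -3)
              AND p.status = 'INFORCE');
    :updated := SQL%ROWCOUNT;   -- return the affected row count to the caller
END;
"""
updated = cur.var(int)
cur.execute(plsql, updated=updated)
conn.commit()
print("riders lapsed:", updated.getvalue())

cur.close()
conn.close()
```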

Posted 2 days ago


Exploring ETL Jobs in India

The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Pune
  4. Hyderabad
  5. Chennai

These cities are known for their thriving tech industries and often have a high demand for ETL professionals.

Average Salary Range

The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.

Career Path

In the ETL field, a typical career path may include roles such as:

  1. Junior ETL Developer
  2. ETL Developer
  3. Senior ETL Developer
  4. ETL Tech Lead
  5. ETL Architect

As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.

Related Skills

Alongside ETL, professionals in this field are often expected to have skills in:

  • SQL
  • Data Warehousing
  • Data Modeling
  • ETL Tools (e.g., Informatica, Talend)
  • Database Management Systems (e.g., Oracle, SQL Server)

Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.

Interview Questions

Here are 25 interview questions that you may encounter in ETL job interviews:

  • What is ETL and why is it important? (basic)
  • Explain the difference between ETL and ELT processes. (medium)
  • How do you handle incremental loads in ETL processes? (medium; see the worked sketch after this list)
  • What is a surrogate key in the context of ETL? (basic)
  • Can you explain the concept of data profiling in ETL? (medium)
  • How do you handle data quality issues in ETL processes? (medium)
  • What are some common ETL tools you have worked with? (basic)
  • Explain the difference between a full load and an incremental load. (basic)
  • How do you optimize ETL processes for performance? (medium)
  • Can you describe a challenging ETL project you worked on and how you overcame obstacles? (advanced)
  • What is the significance of data cleansing in ETL? (basic)
  • How do you ensure data security and compliance in ETL processes? (medium)
  • Have you worked with real-time data integration in ETL? If so, how did you approach it? (advanced)
  • What are the key components of an ETL architecture? (basic)
  • How do you handle data transformation requirements in ETL processes? (medium)
  • What are some best practices for ETL development? (medium)
  • Can you explain the concept of change data capture in ETL? (medium)
  • How do you troubleshoot ETL job failures? (medium)
  • What role does metadata play in ETL processes? (basic)
  • How do you handle complex transformations in ETL processes? (medium)
  • What is the importance of data lineage in ETL? (basic)
  • Have you worked with parallel processing in ETL? If so, explain your experience. (advanced)
  • How do you ensure data consistency across different ETL jobs? (medium)
  • Can you explain the concept of slowly changing dimensions in ETL? (medium; see the worked sketch after this list)
  • How do you document ETL processes for knowledge sharing and future reference? (basic)
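Several of these questions, notably those on incremental loads, surrogate keys, change detection, and slowly changing dimensions, are easiest to answer with a concrete example. The following self-contained sketch uses SQLite from Python's standard library so it runs anywhere; the table and column names are invented for illustration, and a production version would typically express the same logic as MERGE statements inside the warehouse.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_customer (id INTEGER, city TEXT, updated_at TEXT);
CREATE TABLE dim_customer (
    sk INTEGER PRIMARY KEY AUTOINCREMENT,   -- surrogate key
    id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER
);
CREATE TABLE etl_watermark (last_loaded TEXT);
INSERT INTO etl_watermark VALUES ('1900-01-01');
""")

def incremental_scd2_load(conn):
    # 1. Incremental extract: only rows changed since the stored watermark.
    wm = conn.execute("SELECT last_loaded FROM etl_watermark").fetchone()[0]
    changed = conn.execute(
        "SELECT id, city, updated_at FROM src_customer WHERE updated_at > ?",
        (wm,)).fetchall()
    for cid, city, ts in changed:
        # 2. SCD Type 2: close the current version if the attribute changed...
        conn.execute("""
            UPDATE dim_customer SET valid_to = ?, is_current = 0
            WHERE id = ? AND is_current = 1 AND city <> ?""", (ts, cid, city))
        # ...and insert a new current version when none remains open
        # (a no-op change leaves the existing current row untouched).
        if conn.execute("SELECT 1 FROM dim_customer WHERE id = ? AND is_current = 1",
                        (cid,)).fetchone() is None:
            conn.execute("""
                INSERT INTO dim_customer (id, city, valid_from, valid_to, is_current)
                VALUES (?, ?, ?, '9999-12-31', 1)""", (cid, city, ts))
    # 3. Advance the watermark so the next run stays incremental.
    if changed:
        conn.execute("UPDATE etl_watermark SET last_loaded = ?",
                     (max(ts for _, _, ts in changed),))
    conn.commit()

# Demo: an initial load, then a change of city for customer 1.
conn.executemany("INSERT INTO src_customer VALUES (?,?,?)",
                 [(1, "Pune", "2025-06-01"), (2, "Gurgaon", "2025-06-01")])
incremental_scd2_load(conn)
conn.execute("INSERT INTO src_customer VALUES (1, 'Mumbai', '2025-06-10')")
incremental_scd2_load(conn)
for row in conn.execute("SELECT * FROM dim_customer ORDER BY sk"):
    print(row)
```

Running it prints three dimension rows: the closed-out Pune version of customer 1, the still-current Gurgaon row for customer 2, and the new current Mumbai row for customer 1, each with its own surrogate key. The watermark table is what makes the second run incremental rather than a full load.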

Closing Remarks

As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!
