3.0 years
0 Lacs
Hyderābād
On-site
JOB DESCRIPTION
You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you. As a Data Engineer III at JPMorgan Chase within the Consumer & Community Banking Technology Team, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities
- Executes standard software solutions, design, development, and technical troubleshooting
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Designs and develops data pipelines end to end using PySpark, Java, Python, and AWS services
- Utilizes container orchestration services, including Kubernetes, and a variety of AWS tools and services
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
- Adds to the team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Hands-on practical experience in system design, application development, testing, and operational stability
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
- Hands-on practical experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components like Spark and Spark Streaming
- Proficiency in one or more coding languages: Core Java, Python, and PySpark
- Experience with relational and data warehouse databases, and cloud implementation experience with AWS, including:
- AWS data services: proficiency in Lake Formation, Glue ETL (or EMR), S3, Glue Catalog, Athena, Airflow (or Lambda + Step Functions + EventBridge), ECS clusters and ECS apps
- Data de/serialization: expertise in at least two of the formats Parquet, Iceberg, Avro, and JSON
- AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
- Proficiency in automation and continuous delivery methods

Preferred qualifications, capabilities, and skills
- Experience in Snowflake is nice to have
- Solid understanding of agile methodologies such as CI/CD, application resiliency, and security
- In-depth knowledge of the financial services industry and its IT systems
- Practical cloud-native experience, preferably with AWS

ABOUT US
JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

ABOUT THE TEAM
Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions — all while ranking first in customer satisfaction.
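For illustration, an end-to-end PySpark batch pipeline of the kind this role describes might look like the minimal sketch below. The bucket names, paths, and column names are hypothetical, and a production job would add Glue Catalog registration, error handling, and tests.

```python
# Illustrative only: a minimal PySpark extract-transform-load job.
# All S3 paths and column names are assumptions for the sketch.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("txn-daily-etl").getOrCreate()

# Extract: raw JSON events landed in S3 (hypothetical path)
raw = spark.read.json("s3://example-raw-bucket/transactions/2025/07/")

# Transform: basic cleansing plus a daily aggregate for reporting
clean = (
    raw.dropDuplicates(["txn_id"])
       .filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_ts"))
)
daily = clean.groupBy("txn_date", "merchant_id").agg(
    F.sum("amount").alias("total_amount"),
    F.count("*").alias("txn_count"),
)

# Load: partitioned Parquet for downstream Athena/Glue consumers
daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://example-curated-bucket/daily_merchant_totals/"
)
```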
Posted 1 day ago
0 years
0 Lacs
Telangana
On-site
About Chubb
Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India
At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work® for the third consecutive year, a reflection of the culture at Chubb where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization that is dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Key Responsibilities:
- Design, implement, and maintain database systems using SQL and Azure Synapse Analytics.
- Monitor database performance, implement changes, and apply new patches and versions when required.
- Ensure data integrity and security by implementing and managing appropriate access controls and backup/recovery procedures.
- Collaborate with development teams to design and optimize database queries and structures.
- Troubleshoot and resolve database issues, ensuring minimal downtime and data loss.
- Develop and maintain documentation related to database configurations, processes, and service records.
- Assist in the design and implementation of data warehousing solutions using Azure Synapse.
- Provide support for data migration and integration projects.
- Stay updated with the latest industry trends and technologies to ensure our database systems are current and efficient.

Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Database Administrator with a focus on SQL and Azure Synapse Analytics.
- Strong knowledge of database structure systems and data mining.
- Experience with database management tools and software.
- Excellent problem-solving skills and ability to work independently.
- Strong communication skills to collaborate effectively with team members and stakeholders.
- Familiarity with cloud-based database solutions and services, particularly within the Azure ecosystem.

Preferred Skills:
- Experience with other database technologies such as Oracle, MySQL, or PostgreSQL.
- Knowledge of data warehousing concepts and ETL processes.
- Certification in SQL Server or Azure Synapse Analytics is a plus.

Why Chubb?
Join Chubb to be part of a leading global insurance company! Our constant focus on employee experience along with a start-up-like culture empowers you to achieve impactful results.
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work® for the years 2023-2024, 2024-2025 and 2025-2026.
- Laser focus on excellence: At Chubb we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to provide our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits
Our company offers a comprehensive benefits package designed to support our employees’ health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and investment plans: We provide specialized benefits like Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits and car lease that help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, like education reimbursement programs, certification programs and access to global learning programs.
- Health and welfare benefits: We care about our employees’ well-being in and out of work and have benefits like the Employee Assistance Program (EAP), yearly free health campaigns and comprehensive insurance benefits.

Application Process
Our recruitment process is designed to be transparent and inclusive.
Step 1: Submit your application via the Chubb Careers Portal.
Step 2: Engage with our recruitment team for an initial discussion.
Step 3: Participate in HackerRank assessments/technical/functional interviews and assessments (if applicable).
Step 4: Final interaction with Chubb leadership.

Join Us
With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India’s journey.
Apply Now: Chubb External Careers
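As a rough sketch of the performance-monitoring duty this DBA role describes, a scripted check against a SQL Server-family endpoint might look like the following. The connection details are placeholders, and dedicated Synapse SQL pools expose different DMVs (sys.dm_pdw_*), so this is only the general pattern.

```python
# Illustrative sketch: surface the slowest cached statements via DMVs.
# Connection string values are placeholders, not real endpoints.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=exampledb;UID=monitor_user;PWD=..."
)

# Top 5 statements by average elapsed time, from plan-cache statistics
sql = """
SELECT TOP 5
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
    qs.execution_count,
    SUBSTRING(st.text, 1, 200) AS query_snippet
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_us DESC;
"""
for row in conn.cursor().execute(sql):
    print(row.avg_elapsed_us, row.execution_count, row.query_snippet)
```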
Posted 1 day ago
2.0 years
0 Lacs
Telangana
On-site
- Design and develop QlikView and Qlik Sense dashboards and reports.
- Collaborate with business stakeholders to gather and understand requirements.
- Perform data extraction, transformation, and loading (ETL) processes.
- Optimize Qlik applications for performance and usability.
- Ensure data accuracy and consistency across all BI solutions.
- Conduct testing and validation of Qlik applications.
- Provide ongoing support and troubleshooting for Qlik solutions.
- Stay up-to-date with the latest Qlik technologies and industry trends.

Qualifications
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 2+ years of experience in Qlik development (QlikView and Qlik Sense).
- Strong understanding of data visualization best practices.
- Proficiency in SQL and data modeling.
- Experience with ETL processes and tools.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
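Qlik load scripts are written in Qlik's own scripting language, but the "ensure data accuracy and consistency" duty above often reduces to reconciling a dashboard extract against its source. Here is a self-contained Python sketch of that pattern; the in-memory SQLite table and column names stand in for a real warehouse and Qlik export.

```python
# Toy reconciliation: source aggregates vs. a BI extract.
# SQLite and the hard-coded extract are stand-ins for real connections.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales_source (region TEXT, amount REAL);
INSERT INTO sales_source VALUES ('APAC', 120.0), ('EMEA', 80.0), ('APAC', 30.0);
""")

source = pd.read_sql(
    "SELECT region, SUM(amount) AS total FROM sales_source GROUP BY region", conn
)
# In practice this frame would come from the Qlik app's exported data
extract = pd.DataFrame({"region": ["APAC", "EMEA"], "total": [150.0, 80.0]})

check = source.merge(extract, on="region", suffixes=("_src", "_qlik"))
check["match"] = (check["total_src"] - check["total_qlik"]).abs() < 1e-6
print(check)  # any False row flags a discrepancy to investigate
```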
Posted 1 day ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: SAP BusinessObjects Data Services
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Software Development Lead, you will develop and configure software systems, either end-to-end or for specific stages of the product lifecycle. Your typical day will involve collaborating with various teams to ensure the successful implementation of software solutions, applying your knowledge of technologies and methodologies to support projects and clients effectively. You will engage in problem-solving activities, guiding your team through challenges while ensuring that project goals are met efficiently and effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with strategic objectives.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP BusinessObjects Data Services.
- Strong understanding of data integration and transformation processes.
- Experience with ETL (Extract, Transform, Load) methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP BusinessObjects Data Services.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Posted 1 day ago
5.0 years
6 - 9 Lacs
Hyderābād
Remote
Job Description

Role Overview:
A Data Engineer is responsible for designing, building, and maintaining robust data pipelines and infrastructure that facilitate the collection, storage, and processing of large datasets. They collaborate with data scientists and analysts to ensure data is accessible, reliable, and optimized for analysis. Key tasks include data integration, ETL (Extract, Transform, Load) processes, and managing databases and cloud-based systems. Data engineers play a crucial role in enabling data-driven decision-making and ensuring data quality across organizations.

What will you do in this role:
- Develop comprehensive high-level technical design and data mapping documents to meet specific business integration requirements.
- Own the data integration and ingestion solutions throughout the project lifecycle, delivering key artifacts such as data flow diagrams and source system inventories.
- Provide end-to-end delivery ownership for assigned data pipelines, performing cleansing, processing, and validation on the data to ensure its quality.
- Define and implement robust test strategies and test plans, ensuring end-to-end accountability for middleware testing and evidence management.
- Collaborate with the Solutions Architecture and Business Analyst teams to analyze system requirements and prototype innovative integration methods.
- Exhibit a hands-on leadership approach, ready to engage in coding, debugging, and all necessary actions to ensure the delivery of high-quality, scalable products.
- Influence and drive cross-product teams and collaboration while coordinating the execution of complex, technology-driven initiatives within distributed and remote teams.
- Work closely with various platforms and competencies to enrich the purpose of Enterprise Integration and guide their roadmaps to address current and emerging data integration and ingestion capabilities.
- Design ETL/ELT solutions, lead comprehensive system and integration testing, and outline standards and architectural toolkits to underpin our data integration efforts.
- Analyze data requirements and translate them into technical specifications for ETL processes.
- Develop and maintain ETL workflows, ensuring optimal performance and error-handling mechanisms are in place.
- Monitor and troubleshoot ETL processes to ensure timely and successful data delivery.
- Collaborate with data analysts and other stakeholders to ensure alignment between data architecture and integration strategies.
- Document integration processes, data mappings, and ETL workflows to maintain clear communication and ensure knowledge transfer.

What should you have:
- Bachelor’s degree in Information Technology, Computer Science or any technology stream.
- 5+ years of working experience with enterprise data integration technologies: Informatica PowerCenter, Informatica Intelligent Data Management Cloud services (CDI, CAI, Mass Ingest, Orchestration).
- Integration experience utilizing REST and custom API integration.
- Experience with relational database technologies and cloud data stores from AWS, GCP and Azure.
- Experience utilizing the AWS Well-Architected Framework, deployment and integration, and data engineering.
- Preferred experience with CI/CD processes and related tools, including Terraform, GitHub Actions, Artifactory, etc.
- Proven expertise in Python and shell scripting, with a strong focus on leveraging these languages for data integration and orchestration to optimize workflows and enhance data processing efficiency.
- Extensive experience in the design of reusable integration patterns using cloud-native technologies.
- Extensive experience with process orchestration and scheduling integration jobs in AutoSys and Airflow.
- Experience in Agile development methodologies and release management techniques.
- Excellent analytical and problem-solving skills.
- Good understanding of data modeling and data architecture principles.

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Flexible Work Arrangements: Hybrid
Required Skills: Business, Business Intelligence (BI), Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Design Applications, Information Management, Management Process, Social Collaboration, Software Development, Software Development Life Cycle (SDLC), System Designs
Job Posting End Date: 07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.
Requisition ID: R353285
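Since the role calls out scheduling integration jobs in Airflow, here is a minimal sketch of what such an orchestration might look like in Airflow 2.x. The DAG id, task bodies, and schedule are illustrative assumptions, not this team's actual pipeline.

```python
# Hypothetical Airflow 2.x DAG: a two-step extract-then-load pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source API or trigger the Informatica job")

def load():
    print("load validated data into the target store")

with DAG(
    dag_id="example_ingest_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run load only after extract succeeds
```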
Posted 1 day ago
3.0 years
2 - 7 Lacs
Hyderābād
On-site
About Us:
Location - Hyderabad, India
Department - Product R&D
Level - Professional
Working Pattern - Work from office
Benefits - Benefits at Ideagen
DEI - DEI strategy
Salary - This will be discussed at the next stage of the process; if you do have any questions, please feel free to reach out!

We are seeking an experienced Data Engineer with strong problem-solving and analytical skills, high attention to detail, a passion for analytics, real-time data, and monitoring, and critical thinking and collaboration skills. The candidate should be a self-starter and a quick learner, ready to learn new technologies and tools that the job demands.

Responsibilities:
- Building automated pipelines and solutions for data migration/data import or other operations requiring data ETL.
- Performing analysis on core products to support migration planning and development.
- Working closely with the Team Lead and collaborating with other stakeholders to gather requirements and build well-architected data solutions.
- Producing supporting documentation, such as specifications, data models, and relations between data, required for the effective development, usage and communication of the data operations solutions with different stakeholders.

Competencies, Characteristics and Traits:
Mandatory Skills:
- Minimum 3 years of experience with SnapLogic pipeline development, including a minimum of 2 years building ETL/ELT pipelines.
- Experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Experience working with API sources and destinations.

Skills and Experience:
Essential:
- Strong experience working with databases in on-premises and/or cloud-based environments such as MSSQL, MySQL, PostgreSQL, AzureSQL, Aurora MySQL & PostgreSQL, AWS RDS, etc.
- Strong knowledge of databases, data modeling and the data life cycle.
- Proficient in understanding data and writing complex SQL.
- Experience working with REST APIs in data pipelines.
- Strong problem solving and high attention to detail.
- Passion for analytics, real-time data, and monitoring.
- Critical thinking, good communication and collaboration skills.
- Focus on high performance and quality delivery.
- Highly self-motivated and a continuous learner.

Desirable:
- Experience working with NoSQL databases like MongoDB.
- Experience with SnapLogic administration is preferable.
- Experience working with Microsoft Power Platform (Power Automate and Power Apps) or any similar automation/RPA tool.
- Experience with cloud data platforms like Snowflake, Databricks, AWS, Azure, etc.
- Awareness of emerging ETL and cloud concepts such as Amazon AWS or Microsoft Azure.
- Experience working with scripting languages such as Python, R, JavaScript, etc.

About Ideagen
Ideagen is the invisible force behind many things we rely on every day - from keeping airplanes soaring in the sky, to ensuring the food on our tables is safe, to helping doctors and nurses care for the sick. So, when you think of Ideagen, think of it as the silent teammate that's always working behind the scenes to help those people who make our lives safer and better. Every day, millions of people are kept safe using Ideagen software. We have offices all over the world, including America, Australia, Malaysia and India, with people doing lots of different and exciting jobs.

What is next?
If your application meets the requirements for this role, our Talent Acquisition team will be in touch to guide you through the next steps. To ensure a flexible and inclusive process, please let us know if you require any reasonable adjustments by contacting us at recruitment@ideagen.com. All matters will be treated with strict confidence. At Ideagen, we value the importance of work-life balance and welcome candidates seeking flexible or part-time working arrangements. If this is something you are interested in, please let us know during the application process. Enhance your career and make the world a safer place!
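SnapLogic pipelines are assembled in its visual designer rather than in code, but the REST-to-database pattern this role centers on reduces to a familiar shape. The sketch below shows that pattern in plain Python; the API URL, record fields, and SQLite target are illustrative stand-ins for real endpoints.

```python
# Illustrative REST-to-staging-table load. URL, fields, and SQLite
# target are hypothetical; a real pipeline adds paging, auth, retries.
import sqlite3
import requests

resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
orders = resp.json()  # assume a list of {"id": ..., "total": ...} records

conn = sqlite3.connect("staging.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, total REAL)")
conn.executemany(
    "INSERT OR REPLACE INTO orders (id, total) VALUES (?, ?)",
    [(o["id"], o["total"]) for o in orders],
)
conn.commit()
print(f"loaded {len(orders)} order records")
```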
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Hyderābād
On-site
We are seeking a skilled Data Engineer with strong experience in Azure Data Services, Databricks, SQL, and PySpark to join our data engineering team. The ideal candidate will be responsible for building robust and scalable data pipelines and solutions to support advanced analytics and business intelligence initiatives.

Key Responsibilities:
- Design and implement scalable and secure data pipelines using Azure Data Factory, Databricks, and Synapse Analytics.
- Develop and maintain efficient ETL/ELT workflows into and within Databricks.
- Write complex SQL queries for data extraction, transformation, and analysis.
- Develop and optimize data transformation scripts using PySpark.
- Ensure data quality, data governance, and performance optimization across all pipelines.
- Collaborate with data architects, analysts, and business stakeholders to deliver reliable data solutions.
- Perform data modelling and design for both structured and semi-structured data.
- Monitor data pipelines and troubleshoot issues to ensure data integrity and timely delivery.
- Contribute to best practices in cloud data architecture and engineering.

Required Skills:
- 4–8 years of experience in data engineering or related fields.
- Strong experience with Azure Data Services (ADF, Synapse, Databricks, Azure Storage).
- Proficient with data warehouse workloads, including data ingestion, Snowpipe, and streams & tasks.
- Advanced SQL skills, including performance tuning and complex query building.
- Hands-on experience with PySpark for large-scale data processing and transformation.
- Experience with ETL/ELT frameworks, orchestration, and scheduling.
- Familiarity with data modelling concepts (dimensional/star schema).
- Good understanding of data security, role-based access, and auditing in Snowflake and Azure.

Preferred/Good to Have:
- Experience with CI/CD pipelines and DevOps for data workflows.
- Exposure to Power BI or similar BI tools.
- Familiarity with Git, Terraform, or infrastructure-as-code (IaC) in cloud environments.
- Experience with Agile/Scrum methodologies.

Job Type: Full-time
Work Location: In person
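One of the most common PySpark transformations in the ETL/ELT workflows listed above is "keep only the latest record per key" before publishing a curated table. A minimal sketch follows; the ADLS paths and column names are assumptions for illustration.

```python
# Illustrative PySpark dedupe: keep the newest row per customer_id.
# Storage paths and columns are hypothetical.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("latest-record-dedupe").getOrCreate()

updates = spark.read.parquet(
    "abfss://raw@exampleaccount.dfs.core.windows.net/customers/"
)

# Rank rows per customer by update time and keep only the newest one
w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (
    updates.withColumn("rn", F.row_number().over(w))
           .filter(F.col("rn") == 1)
           .drop("rn")
)

latest.write.mode("overwrite").parquet(
    "abfss://curated@exampleaccount.dfs.core.windows.net/customers_current/"
)
```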
Posted 1 day ago
7.0 years
5 - 23 Lacs
Hyderābād
On-site
Job Title: Senior Data Engineer
Experience: 7+ years
Location: Hyderabad, Telangana
Time Zone: IST
Primary Tech Stack: SQL, Query & Database Performance Tuning, ETL, Integrations & Data Transformations, Python Scripting, AWS Core Services (S3, Lambda, IAM)

General Information:
We are looking for exceptional Senior Data Engineers (SDEs) to play a significant role in building our large-scale, high-volume, high-performance data integration and delivery services. These data solutions are primarily used in periodic reporting and drive business decision-making while dealing efficiently with the massive scale of data available through our Data Warehouse as well as our software systems. You will be responsible for designing and implementing solutions using third-party and in-house data processing tools, building dimensional data models, reports, and dashboards, integrating data across disparate and distributed systems, and administering the platform software. You are expected to analyze challenging business problems and build efficient, flexible, extensible, and scalable data models, ETL designs, and data integration services. You will also have an opportunity to build, maintain, and enhance small to mid-size custom-built applications using Python/Java. You are required to support and manage the growth of these data solutions.

Job Description:
As a Data Engineer, you will be working in one of the world's largest cloud-based data lakes. You should be skilled in the architecture of enterprise data warehouse solutions using multiple platforms (EMR, RDBMS, columnar, cloud). You should have extensive experience in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, enabling you to work with business owners to develop and define key business questions and to build data sets that answer those questions. Above all, you should be passionate about working with huge data volumes and love bringing datasets together to answer business questions and drive business growth.

Skills Needed:
- SQL: Expert
- Query & Database Performance Tuning: Expert
- ETL, Integrations & Data Transformations: Proficient
- Python Scripting: Proficient
- AWS Core Services (S3, Lambda, IAM): Intermediate

Job Type: Full-time
Pay: ₹500,298.14 - ₹2,350,039.92 per year
Work Location: In person
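Given the S3 + Lambda + IAM stack named above, a typical building block is an S3-triggered Lambda that processes newly landed files. The sketch below is hypothetical: the event shape follows standard S3 notifications, but the processing logic is just a placeholder.

```python
# Illustrative S3-triggered AWS Lambda handler: read the object that
# fired the event and report basic stats. Logic is a placeholder.
import json

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # S3 put notifications carry bucket and key under Records[].s3
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    obj = s3.get_object(Bucket=bucket, Key=key)
    lines = obj["Body"].read().decode("utf-8").splitlines()

    print(json.dumps({"bucket": bucket, "key": key, "line_count": len(lines)}))
    return {"line_count": len(lines)}
```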
Posted 1 day ago
10.0 years
7 - 20 Lacs
India
On-site
About MostEdge
At MostEdge, we’re on a mission to accelerate commerce and build sustainable, trusted experiences. Our slogan — Protect Every Penny. Power Every Possibility. — reflects our commitment to operational excellence, data integrity, and real-time intelligence that help retailers run smarter, faster, and stronger. Our systems are mission-critical and designed for 99.99999% uptime, powering millions of transactions and inventory updates daily. We work at the intersection of AI, microservices, and retail commerce, and we win as a team.

Role Overview
We are looking for a Senior Database Administrator (DBA) to own the design, implementation, scaling, and performance of our data infrastructure. You will be responsible for mission-critical OLTP systems spanning MariaDB, MySQL, PostgreSQL, and MongoDB, deployed across AWS, GCP, and containerized Kubernetes clusters. This role plays a key part in ensuring data consistency, security, and speed across billions of rows and real-time operations.

Scope & Accountability
What You Will Own
- Manage and optimize multi-tenant, high-availability databases for real-time inventory, pricing, sales, and vendor data.
- Design and maintain scalable, partitioned database architectures across SQL and NoSQL systems.
- Monitor and tune query performance and ensure fast recovery, replication, and backup practices.
- Partner with developers, analysts, and DevOps teams on schema design, ETL pipelines, and microservices integration.
- Maintain security best practices, audit logging, encryption standards, and data retention compliance.

What Success Looks Like
- 99.99999% uptime maintained across all environments.
- <100ms query response times for large-scale datasets.
- Zero unplanned data loss or corruption incidents.
- Developer teams experience zero bottlenecks from DB-related delays.

Skills & Experience
Must-Have
- 10+ years of experience managing OLTP systems at scale.
- Strong hands-on experience with MySQL, MariaDB, PostgreSQL, and MongoDB.
- Proven expertise in replication, clustering, indexing, and sharding.
- Experience with Kubernetes-based deployments, Kafka queues, and Dockerized apps.
- Familiarity with AWS S3 storage, GCP services, and hybrid cloud data replication.
- Experience in startup environments with fast-moving agile teams.
- Track record of creating clear documentation and managing tasks via JIRA.

Nice-to-Have
- Experience with AI/ML data pipelines, vector databases, or embedding stores.
- Exposure to infrastructure as code (e.g., Terraform, Helm).
- Familiarity with LangChain, FastAPI, or modern LLM-driven architectures.

How You Reflect Our Values
- Lead with Purpose: You enable smarter, faster systems that empower our retail customers.
- Build Trust: You create safe, accurate, and recoverable environments.
- Own the Outcome: You take responsibility for uptime, audits, and incident resolution.
- Win Together: You collaborate seamlessly across product, ops, and engineering.
- Keep It Simple: You design intuitive schemas, efficient queries, and clear alerts.

Why Join MostEdge?
- Work on high-impact systems powering real-time retail intelligence.
- Collaborate with a passionate, values-driven team across AI, engineering, and operations.
- Build at scale—with autonomy, ownership, and cutting-edge tech.
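As a sketch of the replication-health monitoring this role implies, here is what an automated check might look like against PostgreSQL (pg_stat_replication is available in PG 10+). The DSN and the 5-second threshold are assumptions; MySQL/MariaDB replicas would be checked with SHOW REPLICA STATUS instead.

```python
# Illustrative replication-lag check against a PostgreSQL primary.
# Connection details and alert threshold are placeholders.
import psycopg2

conn = psycopg2.connect(
    "host=primary.example.internal dbname=appdb user=monitor password=..."
)
cur = conn.cursor()

cur.execute("""
    SELECT client_addr, state,
           COALESCE(EXTRACT(EPOCH FROM replay_lag), 0) AS replay_lag_s
    FROM pg_stat_replication;
""")
for addr, state, lag in cur.fetchall():
    status = "OK" if lag < 5 else "ALERT"  # 5s threshold is an example policy
    print(f"{status}: replica {addr} state={state} lag={lag:.1f}s")
```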
Job Types: Full-time, Permanent
Pay: ₹727,996.91 - ₹2,032,140.73 per year
Benefits:
- Health insurance
- Life insurance
- Paid sick time
- Paid time off
- Provident Fund
Schedule:
- Evening shift
- Morning shift
- US shift
Supplemental Pay:
- Performance bonus
- Yearly bonus
Work Location: In person
Expected Start Date: 31/07/2025
Posted 1 day ago
5.0 years
10 - 27 Lacs
India
On-site
About MostEdge
At MostEdge, our purpose is clear: accelerate commerce and build sustainable, trusted experiences. With every byte of data, we strive to Protect Every Penny. Power Every Possibility. We empower retailers to make real-time, profitable decisions using cutting-edge AI, smart infrastructure, and operational excellence. Our platforms handle:
- hundreds of thousands of sales transactions per hour
- hundreds of vendor purchase invoices per hour
- a few hundred product updates per day
with systems built for 99.99999% uptime. We are building an AI-native commerce engine, and language models are at the heart of this transformation.

Role Overview
We are looking for an AI/ML Expert with deep experience in training and deploying Large Language Models (LLMs) to power MostEdge's next-generation operations, cost intelligence, and customer analytics platform. You will be responsible for fine-tuning domain-specific models using internal structured and unstructured data (product catalogs, invoices, chats, documents), embedding real-time knowledge through RAG pipelines, and enabling AI-powered interfaces that drive search, reporting, insight generation, and operational recommendations.

Scope & Accountability
What You Will Own
- Fine-tune and deploy LLMs for product, vendor, and shopper-facing use cases.
- Design hybrid retrieval-augmented generation (RAG) pipelines with LangChain, FastAPI, and vector DBs (e.g., FAISS, Weaviate, Qdrant).
- Train models on internal datasets (sales, cost, product specs, invoices, support logs) using supervised fine-tuning and LoRA/QLoRA techniques.
- Orchestrate embedding pipelines, prompt tuning, and model evaluation across customer and field operations use cases.
- Deploy LLMs efficiently on RunPod, AWS, or GCP, optimizing for multi-GPU, low-latency inference.
- Collaborate with engineering and product teams to embed model outputs in dashboards, chat UIs, and retail systems.

What Success Looks Like
- 90%+ accuracy on retrieval and reasoning tasks for product/vendor cost and invoice queries.
- <3s inference time across operational prompts, running on GPU-optimized containers.
- Full integration of LLMs with backend APIs, sales dashboards, and product portals.
- 75% reduction in manual effort across selected operational workflows.

Skills & Experience
Must-Have
- 5+ years in AI/ML, with 2+ years working on LLMs or transformer architectures.
- Proven experience training or fine-tuning Mistral, LLaMA, Falcon, or similar open-source LLMs.
- Strong command of LoRA, QLoRA, PEFT, RAG, embeddings, and quantized inference.
- Familiarity with LangChain, HuggingFace Transformers, FAISS/Qdrant, and FastAPI for LLM orchestration.
- Experience deploying models on RunPod, AWS, or GCP using Docker + Kubernetes.
- Proficient in Python, PyTorch, and data preprocessing (structured and unstructured).
- Experience with ETL pipelines, multi-modal data, and real-time data integration.

Nice-to-Have
- Experience with retail, inventory, or customer analytics systems.
- Knowledge of semantic search, OCR post-processing, or auto-tagging pipelines.
- Exposure to multi-tenant environments and secure model isolation for enterprise use.

How You Reflect Our Values
- Lead with Purpose: You empower smarter decisions with AI-first operations.
- Build Trust: You make model behavior explainable, dependable, and fair.
- Own the Outcome: You train and optimize end-to-end pipelines from data to insights.
- Win Together: You partner across engineering, ops, and customer success teams.
- Keep It Simple: You design intuitive models, prompts, and outputs that drive action, not confusion.

Why Join MostEdge?
- Shape how AI transforms commerce and operations at scale.
- Be part of a mission-critical, high-velocity, AI-first company.
- Build LLMs with purpose, connecting frontline data to real-time results.

Job Types: Full-time, Permanent
Pay: ₹1,068,726.69 - ₹2,729,919.70 per year
Benefits:
- Health insurance
- Life insurance
- Paid sick time
- Paid time off
- Provident Fund
Schedule:
- Evening shift
- Morning shift
- US shift
Supplemental Pay:
- Performance bonus
- Yearly bonus
Work Location: In person
Expected Start Date: 15/07/2025
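For a concrete picture of the LoRA fine-tuning named in this role, a minimal setup with HuggingFace Transformers + PEFT looks like the sketch below. The base model choice and target modules are assumptions (they vary by architecture), and the training loop itself is omitted; QLoRA would additionally load the base weights in 4-bit.

```python
# Minimal LoRA setup sketch: freeze the base model, train small
# low-rank adapters. Model id and target_modules are assumptions.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-v0.1"  # hypothetical open-source base
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                # adapter rank
    lora_alpha=16,      # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections, LLaMA-style
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # typically well under 1% trainable

# From here, a Trainer/SFTTrainer pass over domain text (invoices,
# product specs, support logs) would fine-tune only the adapters.
```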
Posted 1 day ago
3.0 years
0 Lacs
Hyderābād
On-site
- 3+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- 4+ years of SQL experience
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience as a data engineer or related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large datasets

- Design, implement, and support data warehouse / data lake infrastructure using the AWS big data stack: Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark/Scala, Athena, etc.
- Extract huge volumes of structured and unstructured data from various sources (relational/non-relational/NoSQL databases) and message streams, and construct complex analyses.
- Develop and manage ETLs to source data from various systems and create a unified data model for analytics and reporting.
- Perform detailed source-system analysis, source-to-target data analysis, and transformation analysis.
- Participate in the full development cycle for ETL: design, implementation, validation, documentation, and maintenance.

- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
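For the Glue portion of the stack above, a job skeleton typically looks like the following. Note the awsglue library only exists inside the Glue runtime, and the database, table, and S3 path here are placeholders, not real resources.

```python
# Hypothetical AWS Glue job skeleton: catalog read, filter, Parquet write.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_ctx = GlueContext(SparkContext.getOrCreate())
job = Job(glue_ctx)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (names are placeholders)
dyf = glue_ctx.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"
)

# Drop rows with null event ids, then write curated Parquet back to S3
cleaned = dyf.toDF().filter("event_id IS NOT NULL")
cleaned.write.mode("overwrite").parquet("s3://example-curated/events/")

job.commit()
```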
Posted 1 day ago
0 years
4 - 6 Lacs
Hyderābād
On-site
As an employee at Thomson Reuters, you will play a role in shaping and leading the global knowledge economy. Our technology drives global markets and helps professionals around the world make decisions that matter. As the world’s leading provider of intelligent information, we want your unique perspective to create the solutions that advance our business and your career. Our Service Management function is transforming into a truly global, data and standards-driven organization, employing best-in-class tools and practices across all disciplines of Technology Operations. This will drive ever-greater stability and consistency of service across the technology estate as we drive towards optimal Customer and Employee experience.

About the role:
In this opportunity as Application Support Analyst, you will:
- Support Informatica development, extractions, and loading; fix data discrepancies and take care of performance monitoring.
- Collaborate with stakeholders such as business teams, product owners, and project management in defining roadmaps for applications and processes.
- Drive continual service improvement and innovation in productivity, software quality, and reliability, including meeting/exceeding SLAs.
- Apply a thorough understanding of ITIL processes related to incident management, problem management, application life cycle management, and operational health management.
- Support applications built on modern application architecture and cloud infrastructure: Informatica PowerCenter/IDQ, JavaScript frameworks and libraries, HTML/CSS/JS, Node.js, TypeScript, jQuery, Docker, AWS/Azure.

About You:
You're a fit for the role of Application Support Analyst - Informatica if your background includes:
- 3 to 8+ years of experience as an Informatica developer and in support, responsible for implementing ETL methodology for data extraction, transformation and loading.
- Knowledge of ETL design: designing new or changed mappings and workflows with the team and preparing technical specifications.
- Experience creating ETL mappings, mapplets, workflows and worklets using Informatica PowerCenter 10.x, and preparing the corresponding documentation.
- Designing and building integrations supporting standard data warehousing objects (type-2 dimensions, aggregations, star schema, etc.).
- Ability to perform source system analysis as required.
- Working with DBAs and data architects to plan and implement appropriate data partitioning strategies in the Enterprise Data Warehouse.
- Implementing versioning of the ETL repository and supporting code as necessary.
- Developing stored procedures, database triggers and SQL queries where needed.
- Implementing best practices and tuning SQL code for optimization.
- Loading data from Salesforce via PowerExchange to relational databases using Informatica.
- Working with XMLs, the XML parser, and Java and HTTP transformations within Informatica.
- Experience integrating various data sources like Oracle, SQL Server, DB2 and flat files in various formats, like fixed width, CSV, Salesforce and Excel.
- In-depth knowledge of, and experience implementing, best practices for the design and development of data warehouses using star schema and snowflake schema design concepts.
- Experience in performance tuning of sources, targets, mappings, transformations, and sessions.
- Support and development activities in a relational database environment: designing tables, procedures/functions, packages, triggers and views in relational databases, and using SQL proficiently in database programming using SNFL.

What’s in it For You?
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting?
Join us and help shape the industries that move society forward. As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
Posted 1 day ago
5.0 years
0 Lacs
Hyderābād
On-site
SUMMARY
The Master Data Management (MDM) Administrator will play a critical role in engaging with stakeholders and technical team members to execute master data creation, maintenance, and governance for our MDM workstream. This position will play a crucial role in managing our master data, ensuring data consistency, and facilitating data-driven decision making. You will collaborate with various departments to ensure data accuracy, integrity, and compliance with established data standards. This position will play a key role in streamlining data-related processes, enhancing data quality, and promoting a data-driven culture within the company. This role will report to the BEST Data Services Senior Manager in our Business Enterprise Systems Technology department. The successful candidate will take a hands-on approach and will assist developers and architects with our Master Data Management (MDM) platform and FaCT Data Foundations teams and processes. A successful MDM Administrator must ensure the highest quality solutions are provided to our business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements toward a successful implementation and ensure all master data meets requirements and is deployed and implemented properly.

PRIMARY RESPONSIBILITIES
- Engage with multiple teams (both technical and non-technical) to understand master data requirements and objectives.
- Implement and enforce data governance policies and procedures to maintain the quality and integrity of master data.
- Perform data entry, validation, and maintenance tasks to ensure accuracy and consistency of master data records.
- Develop and maintain data standards and guidelines for various data elements to be used consistently across the organization.
- Assist in collaborating with multiple teams to define and implement various data structures and hierarchies within the Customer, Product, Pricing, and Supplier data domains.
- Identify and resolve data quality issues, including duplication, inconsistency, and inaccuracies.
- Facilitate data integration and migration projects, ensuring seamless data flows between systems.
- Maintain comprehensive documentation of data processes, standards, and best practices.
- Generate reports and analyze data quality metrics to monitor the effectiveness of data management efforts.
- Provide training and support to end-users on data entry and data management best practices.
- Ensure that master data management practices align with industry regulations and compliance requirements.
- Provide timely troubleshooting and support for master data related problems.
- Ensure data security and compliance with relevant regulations and internal policies.
- Ensure alignment with business objectives.
- Identify and resolve data discrepancies, ensuring data standards are met.
- Work closely with FaCT, IT, and business stakeholders to ensure seamless data migration.
- Effectively communicate project status, issues, and solutions to both technical and non-technical stakeholders.
- Maintain detailed documentation of data migration processes, decisions, and outcomes.
- Provide post-migration support.
- Adopt a proactive, hands-on approach to resolve any issues related to these platforms.
- Collaborate with onshore and offshore business and technical teams to assist with creating solutions for complex internal business operations.
- Work closely with business partners to define strategies for technical solutions, determine requirements, and develop high-level designs.

REQUIRED KNOWLEDGE/SKILLS/ABILITIES
- Minimum of 5 years of hands-on master data management administration experience with a focus on the customer, pricing, and product data domains.
- Oracle CX Sales (CDM) and experience in VBCS with the AR module (O2C); knowledge of integration with OIC & ATP is a plus.
- Knowledge of data quality and data profiling tools.
- Familiarity with data integration and ETL processes.
- Strong understanding of data structures, databases, and data integrations.
- Strong communication skills.
- Proficiency in designing and implementing process workflows and data diagrams.
- Proven Agile development experience; you understand what Epics, Features and Stories are and can define one.
- Excellent problem solver and independent thinker who can create innovative solutions.
- Exceptional communication, analytical, and management skills, with the ability to present technical concepts to both business executives and technical teams.
- Able to manage daily stand-ups, escalations, issues, and risks.
- Self-directed, adaptable, empathetic, flexible, and forward-thinking.
- Strong organizational, interpersonal, and relationship-building skills conducive to collaboration.
- Passionate about technology, digital transformation, and business process reengineering.
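A core MDM task named above is identifying duplicate master records. As a toy sketch of that idea, the standard-library snippet below flags likely duplicate customer names by fuzzy similarity; real MDM platforms use much richer match rules, and the records and threshold here are invented for illustration.

```python
# Toy duplicate detection over customer master records (stdlib only).
from difflib import SequenceMatcher
from itertools import combinations

customers = [
    {"id": 1, "name": "Acme Industries Pvt Ltd"},
    {"id": 2, "name": "ACME Industries Private Limited"},
    {"id": 3, "name": "Globex Corporation"},
]

def similarity(a: str, b: str) -> float:
    # Case-insensitive ratio in [0, 1]; 1.0 means identical strings
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for left, right in combinations(customers, 2):
    score = similarity(left["name"], right["name"])
    if score > 0.7:  # example threshold for a manual-review queue
        print(f"possible duplicate: {left['id']} vs {right['id']} (score={score:.2f})")
```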
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Hyderābād
On-site
We are seeking a talented and detail-oriented Data Analyst to join our Reporting Team. In this role, you will specialize in curating insightful and visually compelling reports using tools such as Power BI, Tableau, Python, Excel, and PowerPoint. A key component of this position is integrating AI solutions into our reporting processes to enhance data-driven decision-making for our stakeholders. Collaboration with stakeholders is essential to ensure our reporting solutions effectively meet their needs. If you are passionate about data visualization and leveraging AI technologies, we would love to hear from you!

About the Role
In this opportunity as a Data Analyst, you will:
- Develop, design, and maintain interactive and dynamic reports and dashboards using Power BI, Tableau, Excel, and PowerPoint.
- Collaborate closely with stakeholders to understand their reporting needs, deliver actionable insights, and ensure satisfaction.
- Utilize AI and machine learning techniques to enhance reporting solutions and provide predictive insights.
- Analyze complex datasets to identify trends, patterns, and anomalies that can inform business decisions.
- Ensure data integrity and accuracy in all reporting solutions.
- Provide training and support to team members and stakeholders on the use of reporting tools and AI technologies.
- Continuously seek opportunities to improve reporting processes and tools, staying updated with the latest industry trends and technologies.
- Communicate findings and recommendations to stakeholders through clear and concise presentations and reports.

About you:
You're a fit for the role of Data Analyst if you have:
- A bachelor’s degree in Data Science, Computer Science, Statistics, Business Analytics, or a related field.
- 3-6 years of experience as a Data Analyst or in a similar role, with a strong portfolio of reporting and dashboard projects.
- Proficiency in Power BI, Tableau, Python, Excel, and PowerPoint.
- Experience with AI technologies and machine learning algorithms.
- Strong data analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Excellent communication and presentation skills.
- Ability to work collaboratively in a team environment as well as independently.
- Experience with programming languages such as Python or R.
- Familiarity with SQL for data extraction and manipulation.
- Knowledge of data warehousing, ETL processes, and LLMs.

What’s in it For You?
- Hybrid Work Model: We’ve adopted a flexible hybrid working environment (2-3 days a week in the office depending on the role) for our office-based roles while delivering a seamless experience that is digitally and physically connected.
- Flexibility & Work-Life Balance: Flex My Way is a set of supportive workplace policies designed to help manage personal and professional responsibilities, whether caring for family, giving back to the community, or finding time to refresh and reset. This builds upon our flexible work arrangements, including work from anywhere for up to 8 weeks per year, empowering employees to achieve a better work-life balance.
- Career Development and Growth: By fostering a culture of continuous learning and skill development, we prepare our talent to tackle tomorrow’s challenges and deliver real-world solutions. Our Grow My Way programming and skills-first approach ensures you have the tools and knowledge to grow, lead, and thrive in an AI-enabled future.
- Industry Competitive Benefits: We offer comprehensive benefit plans to include flexible vacation, two company-wide Mental Health Days off, access to the Headspace app, retirement savings, tuition reimbursement, employee incentive programs, and resources for mental, physical, and financial wellbeing.
- Culture: Globally recognized, award-winning reputation for inclusion and belonging, flexibility, work-life balance, and more. We live by our values: Obsess over our Customers, Compete to Win, Challenge (Y)our Thinking, Act Fast / Learn Fast, and Stronger Together.
- Social Impact: Make an impact in your community with our Social Impact Institute. We offer employees two paid volunteer days off annually and opportunities to get involved with pro-bono consulting projects and Environmental, Social, and Governance (ESG) initiatives.
- Making a Real-World Impact: We are one of the few companies globally that helps its customers pursue justice, truth, and transparency. Together, with the professionals and institutions we serve, we help uphold the rule of law, turn the wheels of commerce, catch bad actors, report the facts, and provide trusted, unbiased information to people all over the world.

About Us
Thomson Reuters informs the way forward by bringing together the trusted content and technology that people and organizations need to make the right decisions. We serve professionals across legal, tax, accounting, compliance, government, and media. Our products combine highly specialized software and insights to empower professionals with the data, intelligence, and solutions needed to make informed decisions, and to help institutions in their pursuit of justice, truth, and transparency. Reuters, part of Thomson Reuters, is a world leading provider of trusted journalism and news. We are powered by the talents of 26,000 employees across more than 70 countries, where everyone has a chance to contribute and grow professionally in flexible work environments. At a time when objectivity, accuracy, fairness, and transparency are under attack, we consider it our duty to pursue them. Sound exciting? Join us and help shape the industries that move society forward.

As a global business, we rely on the unique backgrounds, perspectives, and experiences of all employees to deliver on our business goals. To ensure we can do that, we seek talented, qualified employees in all our operations around the world regardless of race, color, sex/gender, including pregnancy, gender identity and expression, national origin, religion, sexual orientation, disability, age, marital status, citizen status, veteran status, or any other protected classification under applicable law. Thomson Reuters is proud to be an Equal Employment Opportunity Employer providing a drug-free workplace. We also make reasonable accommodations for qualified individuals with disabilities and for sincerely held religious beliefs in accordance with applicable law. More information on requesting an accommodation here. Learn more on how to protect yourself from fraudulent job postings here. More information about Thomson Reuters can be found on thomsonreuters.com.
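The "identify trends, patterns, and anomalies" duty in the Data Analyst role above often starts with something as simple as a rolling z-score over a daily metric. The self-contained sketch below uses synthetic data for illustration; in practice the result would feed a Power BI or Tableau dataset.

```python
# Self-contained anomaly flagging with a rolling z-score (synthetic data).
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
days = pd.date_range("2025-01-01", periods=60, freq="D")
values = rng.normal(100, 5, size=60)
values[45] = 160  # inject one obvious anomaly

df = pd.DataFrame({"day": days, "value": values})
roll = df["value"].rolling(window=14)
df["zscore"] = (df["value"] - roll.mean()) / roll.std()

# Rows more than 3 rolling standard deviations out are flagged for review
print(df[df["zscore"].abs() > 3][["day", "value", "zscore"]])
```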
Posted 1 day ago
6.0 years
10 Lacs
Hyderābād
On-site
Experience: 6+ years
Work Mode: Hybrid
Job Summary: We are seeking a skilled Informatica ETL Developer with 5+ years of experience in ETL and Business Intelligence projects. The ideal candidate will have a strong background in Informatica PowerCenter, a solid understanding of data warehousing concepts, and hands-on experience in SQL, performance tuning, and production support. This role involves designing and maintaining robust ETL pipelines to support digital transformation initiatives for clients in the manufacturing, automotive, transportation, and engineering domains.
Key Responsibilities:
Design, develop, and maintain ETL workflows using Informatica PowerCenter.
Troubleshoot and optimize ETL jobs for performance and reliability.
Analyze complex data sets and write advanced SQL queries for data validation and transformation.
Collaborate with data architects and business analysts to implement data warehousing solutions.
Apply SDLC methodologies throughout the ETL development lifecycle.
Support production environments by identifying and resolving data and performance issues.
Work with Unix shell scripting for job automation and scheduling.
Contribute to the design of technical architectures that support digital transformation.
Required Skills:
3–5 years of hands-on experience with Informatica PowerCenter.
Proficiency in SQL and familiarity with NoSQL platforms.
Experience in ETL performance tuning and troubleshooting.
Solid understanding of Unix/Linux environments and scripting.
Excellent verbal and written communication skills.
Preferred Qualifications:
AWS Certification or experience with cloud-based data integration is a plus.
Exposure to data modeling and data governance practices.
Job Type: Full-time
Pay: From ₹1,000,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Hyderabad, Telangana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s):
What is your current CTC?
What is your expected CTC?
What is your current location?
What is your notice period/LWD?
Are you comfortable attending an L2 face-to-face interview in Hyderabad?
Experience:
Informatica PowerCenter: 5 years (Required)
Total work: 6 years (Required)
Work Location: In person
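To make the data-validation responsibility above concrete, here is a minimal sketch of a row-count reconciliation between a staging table and its warehouse target. This is an illustration only: the cx_Oracle driver choice, the connection string, and the STG_ORDERS/DW_ORDERS table names are assumptions, not details from the posting.

```python
# Hypothetical ETL validation sketch: reconcile row counts between a
# staging table and its warehouse target after a load completes.
import cx_Oracle  # any DB-API 2.0 driver would work the same way

def reconcile_counts(conn, source_table: str, target_table: str) -> bool:
    """Return True when the target received every staged row."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {source_table}")
    source_rows = cur.fetchone()[0]
    cur.execute(f"SELECT COUNT(*) FROM {target_table}")
    target_rows = cur.fetchone()[0]
    if source_rows != target_rows:
        print(f"Mismatch: {source_table}={source_rows}, {target_table}={target_rows}")
        return False
    return True

if __name__ == "__main__":
    # Placeholder credentials and table names for illustration only.
    conn = cx_Oracle.connect("etl_user/secret@dwh-host:1521/ORCLPDB")
    try:
        ok = reconcile_counts(conn, "STG_ORDERS", "DW_ORDERS")
        print("Load verified" if ok else "Investigate rejected rows")
    finally:
        conn.close()
```

In practice a check like this would be one of several post-load validations (row counts, sums of key measures, null-rate thresholds) scheduled after the Informatica workflow completes.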
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Hyderābād
Remote
Location: Hyderabad, India (Hybrid)
This is a hybrid position primarily based in Hyderabad, India. We’re committed to your flexibility and wellbeing, and our hybrid strategy currently requires three days a week in the office, giving you the option to work remotely for some of your working week. Find out more about our culture of flexible working.
We give you a world of potential
Support is awesome in the way trust makes it work! When you join this dynamic team as a Software Engineer, you will enjoy a career, teamwork, flexibility, and leadership you can trust to help accelerate your personal and professional goals. Come be a part of a world of potential at Computershare Business Support Services.
Corporate Trust is a market leader with decades of experience as a provider of trustee and sophisticated agency services for private and public companies, investment bankers, asset managers, as well as governments and institutions. We offer a wide range of services and a best-in-class reputation built on our high-touch approach to client service, and we are looking for people to join us and be a part of our exciting future as one of the top corporate trust firms globally.
A key part of this role will be collaborating with our onshore teams to service our Corporate Trust business lines and help us to deliver the professional services our clients trust and depend on. If you’re a match to those skills, have the passionate drive to be part of something truly amazing while working on a diverse team, and have the willingness to learn multiple tasks, then this is the perfect opportunity for you!
A role you will love
This role will work within an Agile environment to develop and support applications across the Computershare portfolio, and will lead moderately complex initiatives and deliverables within its technical domain. Working within cross-functional teams, it requires strong technical skills, curiosity, a passion for delivering quality solutions, and the drive to continually improve the quality and speed with which we deliver value to the business. This role will resolve moderately complex issues and lead a team to meet existing and potential clients' needs while leveraging a solid understanding of functions, policies, procedures, and compliance requirements.
In Technology Services (CTS) we partner with our global businesses, providing technology services and IT support, designing, and developing new products to support our clients, customers, and employees. These business-aligned CIO teams leverage the expertise and capacity of enterprise-wide teams, such as the Digital Foundry, the Global Development team and many of our CTO teams. To continually improve our capabilities and speed to market, we have our own innovation, product management and manufacture practices and frameworks which are regularly refined. We ensure that colleagues around the world have access to the technology and agreed service levels that they need to take care of their clients and their clients’ shareholders, employees, and customers.
Some of your key responsibilities will include:
Apply knowledge of standards, policies, best practice and organizational structure so that you can work both independently and collaboratively within your team and with key stakeholders.
Provide informal guidance and share knowledge with colleagues to enable them to contribute to the team’s objectives.
Ensure the quality of tasks, services and information provided by your team – through the quality of your own work and the support you provide to others – so that your team delivers high-quality, maintainable software which adheres to internal standards and policies.
Support the evaluation and resolution of technical challenges and blockers to minimize their impact on the team’s delivery and/or supported products.
Identify and support improvements and innovation in technologies/practices within your team that would benefit the business, e.g. efficiency in the software development process or improved customer experience.
What will you bring to the role?
We are a global business with an entrepreneurial spirit, and we are proud of that. What comes with this is a fast-paced environment and lots of change, so you will be resilient in nature and able to adapt quickly and embrace the pace of change we often work at.
We are looking for people with these skills:
Overall 3-6 years of experience in SSRS development.
Hands-on experience with databases (Oracle/SQL Server) and Reporting Services (SSRS).
Design and develop Oracle/SQL Server stored procedures, functions, views and triggers to be used during the ETL process.
Core knowledge of SSRS reports and SQL Server, including creating complex stored procedures. Knowledge of Crystal Reports is also required in order to convert reports from Crystal to SSRS.
Very strong at writing and creating SSRS reports, and able to work independently.
Well versed in integrating Oracle and SSRS.
Designing and developing SSIS/SQL ETL solutions to acquire and prepare data from numerous upstream systems for processing is a good-to-have skill.
Understands how to convert Crystal Reports to SSRS.
Debugs and tunes SSRS and suggests improvements.
Able to write and maintain database objects (tables, views, indexes). Proficient with MS SQL Server; able to write queries and complex stored procedures with a keen eye for finding issues.
Test and prepare ETL processes for deployment to production and non-production environments.
Support system and acceptance testing, including the development or refinement of test plans.
Good understanding of test automation.
Exposure to Power BI would be an added advantage.
Collaborates and communicates well, builds great working relationships, influences others, challenges effectively and responds well to challenge from others, shares information and ideas with others, has good listening skills.
Has a strong work ethic and is able to deal with sometimes conflicting priorities.
Curious and a continuous learner – investigates, interprets and grasps new concepts.
Self-motivated and able to use own initiative to work with limited guidance and implement innovative solutions.
Pays attention to detail, finds root causes and takes a rigorous approach to problem solving.
Rewards designed for you
Health and wellbeing rewards that can be tailored to support you and your family.
Save for your future. We will support you along your retirement savings journey.
Paid parental leave, flexible working and a caring and inclusive culture.
Income protection. To ease concerns when the unexpected occurs, our package includes short and long-term disability benefits, life insurance, supplemental life insurance (single/spouse/family) and more.
And more. Ours is a welcoming and close-knit community, with experienced colleagues ready to help you grow.
Our careers hub will help you find out more about our rewards and life at Computershare; visit computershare.com/careershub. #LI-DNP
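To illustrate the stored-procedure work behind SSRS reports mentioned above, here is a hedged sketch that calls a SQL Server procedure from Python with pyodbc, the way one might spot-check a report dataset before deployment. The server, database, procedure name, and parameters are hypothetical.

```python
# Hypothetical sketch: exercise a SQL Server stored procedure of the kind
# that typically backs an SSRS dataset, to validate its output shape and
# volume before the report is published.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reports-db;DATABASE=CorporateTrust;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# SSRS passes report parameters through to the procedure; here a date range.
cursor.execute("EXEC dbo.usp_TrustActivityReport ?, ?", "2025-01-01", "2025-03-31")
rows = cursor.fetchall()

print(f"Report dataset returned {len(rows)} rows")
for row in rows[:5]:  # spot-check the first few rows
    print(row)
conn.close()
```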
Posted 1 day ago
15.0 years
0 Lacs
Hyderābād
On-site
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Ab Initio
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function seamlessly to support business operations. You will engage in problem-solving discussions and contribute innovative ideas to enhance application performance and user experience, all while adhering to project timelines and quality standards.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in code reviews to ensure best practices and quality standards are maintained.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with application development lifecycle methodologies.
- Familiarity with database management systems and SQL.
- Ability to troubleshoot and resolve application issues efficiently.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 day ago
7.0 - 8.0 years
4 - 7 Lacs
Hyderābād
On-site
Location: Hyderabad, IN
Employment type: Employee
Place of work: Office
Offshore/Onshore: Onshore
TechnipFMC is committed to driving real change in the energy industry. Our ambition is to build a sustainable future through relentless innovation and global collaboration – and we want you to be part of it. You’ll be joining a culture that values curiosity, expertise, and ideas as well as diversity, inclusion, and authenticity. Bring your unique energy to our team of more than 20,000 people worldwide, and discover a rewarding, fulfilling, and varied career that you can take anywhere you want to go.
Job Purpose
The Data Analyst plays a crucial lead role in managing and optimizing business intelligence solutions using Power BI.
Job Description
Leadership and Strategy: Lead the design, development, and deployment of Power BI reports and dashboards. Provide strategic direction for data visualization and business intelligence initiatives. Interface with the Business Owner, Project Manager, Planning Manager, Resource Managers, etc. Develop a roadmap for the execution of complex data analytics projects.
Data Modeling and Integration: Develop complex data models, establish relationships, and ensure data integrity. Oversee data integration from various sources.
Advanced Analytics: Perform advanced data analysis using DAX (Data Analysis Expressions) and other analytical tools to derive insights and support decision-making.
Collaboration: Work closely with stakeholders to gather requirements, define data needs, and ensure the delivery of high-quality BI solutions.
Performance Optimization: Optimize solutions for performance, ensuring efficient data processing and report rendering.
Mentorship: Mentor and guide junior developers, providing technical support and best practices for Power BI development.
Data Security: Implement and maintain data security measures, ensuring compliance with data protection regulations.
Demonstrated experience of leading complex projects with a team of varied experience levels.
You are meant for this job if:
Educational Background: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. Experience in working with unstructured data and data integration.
Technical Skills: Proficiency in Power BI, DAX, SQL, and data modeling, with exposure to data engineering. Experience with data integration tools and ETL processes. Hands-on experience with Snowflake.
Experience: 7-8 years of experience in business intelligence and data analytics, with a focus on Power BI.
Soft Skills: Strong analytical and problem-solving skills, excellent communication abilities, and the capacity to lead and collaborate with global cross-functional teams.
Skills
Change Leadership
Process Mapping
Being a global leader in the energy industry requires an inclusive and diverse environment. TechnipFMC promotes diversity, equity, and inclusion by ensuring equal opportunities to all ages, races, ethnicities, religions, sexual orientations, gender expressions, disabilities, or all other pluralities. We celebrate who you are and what you bring. Every voice matters and we encourage you to add to our culture.
TechnipFMC respects the rights and dignity of those it works with and promotes adherence to internationally recognized human rights principles for those in its value chain.
Date posted: Jun 16, 2025
Requisition number: 13774
Posted 1 day ago
5.0 years
0 Lacs
Gurgaon
On-site
Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 17,000 stores in 31 countries, serving more than 6 million customers each day. It is an exciting time to be a part of the growing Data Engineering team at Circle K. We are driving a well-supported cloud-first strategy to unlock the power of data across the company and help teams to discover, value and act on insights from data across the globe. With our strong data pipeline, this position will play a key role partnering with our Technical Development stakeholders to enable analytics for long-term success.
About the role
We are looking for a Senior Data Engineer with a collaborative, “can-do” attitude who is committed and strives with determination and motivation to make their team successful, and who has experience architecting and implementing technical solutions as part of a greater data transformation strategy. This role is responsible for hands-on sourcing, manipulation, and delivery of data from enterprise business systems to the data lake and data warehouse. This role will help drive Circle K’s next phase in the digital journey by modeling and transforming data to achieve actionable business outcomes. The Sr. Data Engineer will create, troubleshoot and support ETL pipelines and the cloud infrastructure involved in the process, and will be able to support the visualizations team.
Roles and Responsibilities
Collaborate with business stakeholders and other technical team members to acquire and migrate data sources that are most relevant to business needs and goals.
Demonstrate deep technical and domain knowledge of relational and non-relational databases, data warehouses and data lakes, among other structured and unstructured storage options.
Determine the solutions that are best suited to develop a pipeline for a particular data source.
Develop data flow pipelines to extract, transform, and load data from various data sources in various forms, including custom ETL pipelines that enable model and product development.
Efficient in ETL/ELT development using Azure cloud services and Snowflake, testing and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
Work with modern data platforms including Snowflake to develop, test, and operationalize data pipelines for scalable analytics delivery.
Provide clear documentation for delivered solutions and processes, integrating documentation with the appropriate corporate stakeholders.
Identify and implement internal process improvements for data management (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability).
Stay current with and adopt new tools and applications to ensure high-quality and efficient solutions.
Build a cross-platform data strategy to aggregate multiple sources and process development datasets.
Proactive in stakeholder communication; mentor/guide junior resources through regular KT/reverse KT, help them identify production bugs/issues if needed, and provide resolution recommendations.
Job Requirements
Bachelor’s Degree in Computer Engineering, Computer Science or related discipline; Master’s Degree preferred.
5+ years of ETL design, development, and performance tuning using ETL tools such as SSIS/ADF in a multi-dimensional Data Warehousing environment.
5+ years of experience with setting up and operating data pipelines using Python or SQL.
5+ years of advanced SQL programming: PL/SQL, T-SQL.
5+ years of experience working with Snowflake, including Snowflake SQL, data modeling, and performance optimization.
Strong hands-on experience with cloud data platforms such as Azure Synapse and Snowflake for building data pipelines and analytics workloads.
5+ years of strong and extensive hands-on experience in Azure, preferably data-heavy/analytics applications leveraging relational and NoSQL databases, Data Warehouse and Big Data.
5+ years of experience with Azure Data Factory, Azure Synapse Analytics, Azure Analysis Services, Azure Databricks, Blob Storage, Databricks/Spark, Azure SQL DW/Synapse, and Azure Functions.
5+ years of experience in defining and enabling data quality standards for auditing and monitoring.
Strong analytical abilities and a strong intellectual curiosity.
In-depth knowledge of relational database design, data warehousing and dimensional data modeling concepts.
Understanding of REST and good API design.
Experience working with Apache Iceberg, Delta tables and distributed computing frameworks.
Strong collaboration and teamwork skills, and excellent written and verbal communication skills.
Self-starter, motivated, and able to work in a fast-paced development environment. Agile experience highly desirable.
Proficiency in the development environment, including IDE, database server, Git, continuous integration, unit-testing tools, and defect management tools.
Knowledge
Strong knowledge of Data Engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management).
Strong working knowledge of Snowflake, including warehouse management, Snowflake SQL, and data sharing techniques.
Experience building pipelines that source from or deliver data into Snowflake in combination with tools like ADF and Databricks.
Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control tools, Master Data Management (MDM) and Data Quality tools.
Strong experience in ETL/ELT development, QA and operation/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
Hands-on experience with databases (Azure SQL DB, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting.
ADF, Databricks and Azure certification is a plus.
Technologies we use: Databricks, Azure SQL DW/Synapse, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI, Snowflake
#LI-DS1
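For a flavor of the pipeline work this posting describes, below is a minimal PySpark extract-transform-load sketch. The storage paths, column names, and container layout are assumptions for illustration; a production pipeline would add schema enforcement, auditing, and error handling.

```python
# Minimal PySpark ETL sketch: extract a raw CSV drop from blob storage,
# apply a simple transformation, and load partitioned Parquet into the
# curated zone. All paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-sales-etl").getOrCreate()

# Extract: raw daily files landed by an upstream process (e.g. ADF).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://landing@storageacct.dfs.core.windows.net/sales/2025-06-16/"))

# Transform: standardize types, derive a date column, drop bad records.
clean = (raw
         .withColumn("sale_ts", F.to_timestamp("sale_ts"))
         .withColumn("sale_date", F.to_date("sale_ts"))
         .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
         .filter(F.col("amount") > 0))

# Load: partitioned Parquet, ready for downstream Synapse/Snowflake loads.
(clean.write
      .mode("overwrite")
      .partitionBy("sale_date")
      .parquet("abfss://curated@storageacct.dfs.core.windows.net/sales/"))
```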
Posted 1 day ago
6.0 years
0 Lacs
Gurgaon
On-site
About Gartner IT:
Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.
About this role:
The Senior Software Engineer will provide technical expertise in designing and building Master Data Management solutions and other Chief Data Office initiatives to meet shifting organizational demands. This role will be responsible for building a Master Data Management solution on the Ataccama MDM platform. You will be part of the CDO Execution Team, working on the MDM program or warehouse. MDM brings data in from multiple sources and enriches the information using validation/standardization and dedupe processes. MDM is a centralized hub for the contact and account domains across Gartner, which standardizes and enriches information and shares it across multiple systems within Gartner. Enrichment also includes fetching the latest data from multiple vendors and sharing it across systems within Gartner.
What you’ll do:
Review and analyze business requirements and design technical mapping documents
Build new processes in Ataccama
Build new ETL jobs
Help define best practices & processes
Collaborate on Master Data Management, architecture and technical design discussions
Build new ETL using Azure Data Factory and Synapse
Perform and participate in code reviews, peer inspections and technical design and specifications, as well as document and review detailed designs
Provide status reports to higher management
Maintain service levels and department goals for problem resolution
What you’ll need:
Strong IT professional with 6+ years of experience in ETL, Master Data Management solutions and database operations. The candidate should have strong analytical and problem-solving skills.
Must have:
Experience in database operations with a bachelor’s degree (Computer Science preferred)
Understanding of data modelling
Hands-on experience in MDM implementation using MDM tools (customer domain, product domain, etc.); Ataccama preferred
Experience in ETL technology
Experience in PL/SQL
Experience in PostgreSQL and cloud databases
Good exposure to writing complex SQL
Hands-on experience with Synapse
Commitment to teamwork as a contributor
Nice to have:
Good knowledge of cloud technology and exposure to cloud tools
Good understanding of business processes and analyzing underlying data
Experience with the Python/Java programming languages
Experience with an Agile methodology like Scrum
Who you are:
Bachelor’s degree or foreign equivalent degree in Computer Science or a related field required
Excellent communication skills
Able to work independently or within a team proactively in a fast-paced AGILE-SCRUM environment
Owns success – takes responsibility for the successful delivery of solutions
Strong desire to improve upon their skills in tools and technologies
Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.
#LI-NS4
Who are we?
At Gartner, Inc. (NYSE:IT), we guide the leaders who shape the world.
Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.
What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.
What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.
Job Requisition ID: 101125
By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence.
Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
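To illustrate the validation/standardization and dedupe step that the MDM description above refers to, here is a toy Python sketch of a match-key dedupe over contact records. Real MDM platforms such as Ataccama use configurable match and survivorship rules; the fields, normalization, and first-record-wins rule here are simplifying assumptions.

```python
# Toy illustration of an MDM standardize-and-dedupe pass over contacts.
import re

def normalize(contact: dict) -> tuple:
    """Build a match key from a lower-cased email and a stripped phone number."""
    email = contact.get("email", "").strip().lower()
    phone = re.sub(r"\D", "", contact.get("phone", ""))[-10:]
    return (email, phone)

def dedupe(contacts: list[dict]) -> list[dict]:
    """Keep the first record seen for each match key (a naive survivorship rule)."""
    survivors: dict[tuple, dict] = {}
    for c in contacts:
        survivors.setdefault(normalize(c), c)
    return list(survivors.values())

records = [
    {"name": "A. Rao", "email": "A.Rao@Example.com", "phone": "+91 98765-43210"},
    {"name": "Anita Rao", "email": "a.rao@example.com", "phone": "9876543210"},
]
print(dedupe(records))  # one surviving "golden record" for the duplicate pair
```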
Posted 1 day ago
7.0 - 9.0 years
0 Lacs
Gurgaon
On-site
Role Description:
As a Technical Lead - Cloud Data Platform (AWS) at Incedo, you will be responsible for designing, deploying and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists and business analysts to understand business requirements and design scalable, reliable and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
Designing, developing and deploying cloud-based data platforms using Amazon Web Services (AWS)
Integrating and processing large amounts of structured and unstructured data from various sources
Implementing and optimizing ETL processes and data pipelines
Developing and maintaining security and access controls
Collaborating with other teams to ensure the consistency and integrity of data
Troubleshooting and resolving data platform issues
Technical Skills Requirements:
In-depth knowledge of AWS services and tools such as AWS Glue, AWS Redshift, and AWS Lambda
Experience in building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
Familiarity with cloud-based infrastructure and deployment, specifically on AWS
Strong knowledge of programming languages such as Python, Java, and SQL
Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner
Must understand the company's long-term vision and align with it
Should be open to new ideas and willing to learn and develop new skills; should also be able to work well under pressure and manage multiple tasks and priorities
Qualifications:
7-9 years of work experience in the relevant field
B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
Job Types: Full-time, Permanent
Schedule: Day shift / Morning shift
Work Location: In person
Posted 1 day ago
5.0 years
4 - 8 Lacs
Gurgaon
On-site
Job details
Employment Type: Full-Time
Location: Gurgaon, Sector, India
Job Category: Innovation & Technology
Job Number: WD30242868
Job Description
Who we are
Johnson Controls is the global leader for smart, healthy and sustainable buildings. At Johnson Controls, we’ve been making buildings smarter since 1885, and our capabilities, depth of innovation experience, and global reach have been growing ever since. Today, we offer the world’s largest portfolio of building products, technologies, software, and services; we put that portfolio to work to transform the environments where people live, work, learn and play.
This is where Johnson Controls comes in, helping drive the outcomes that matter most. Through a full range of systems and digital solutions, we make your buildings smarter. A smarter building is safer, more comfortable, more efficient, and, ultimately, more sustainable. Most importantly, smarter buildings let you focus more intensely on your unique mission. Better for your people. Better for your bottom line. Better for the planet.
We’re helping to create a healthy planet with solutions that decrease energy use, reduce waste and make carbon neutrality a reality. Sustainability is a top priority for our company. We have committed to investing 75 percent of new product development R&D in climate-related innovation to develop sustainable products and services. We take sustainability seriously. Achieving net zero carbon emissions before 2040 is just one of our commitments to making the world a better place.
Please visit and follow Johnson Controls LinkedIn for recent exciting activities.
Why JCI: https://www.youtube.com/watch?v=nrbigjbpxkg
Asia-Pacific LinkedIn: https://www.linkedin.com/showcase/johnson-controls-asia-pacific/posts/?feedView=all
Career: The Power Behind Your Mission
OpenBlue: This is How a Space Comes Alive
How will you do it?
Solution Architecture Design: Design scalable and efficient data architectures using Snowflake that meet business needs and best practices
Implementation: Lead the deployment of Snowflake solutions, including data ingestion, transformation, and visualization processes
Data Governance & Security: Ensure compliance with global data regulations in accordance with the data strategy and cybersecurity initiatives
Collaboration: Work closely with data engineers, data scientists, and business stakeholders to gather requirements and provide technical guidance
Optimization: Monitor and optimize performance, storage, and cost of Snowflake environments, implementing best practices for data modeling and querying
Integration: Integrate Snowflake with other cloud services and tools (e.g., ETL/ELT tools, BI tools, data lakes) to create seamless data workflows
Documentation: Create and maintain documentation for architecture designs, data models, and operational procedures
Training and Support: Provide training and support to teams on Snowflake usage and best practices
Troubleshooting: Identify and resolve issues related to Snowflake performance, security, and data integrity
Stay Updated: Keep abreast of Snowflake updates, new features, and industry trends to continually enhance solutions and methodologies
Assist Data Architects in implementing Snowflake-based data warehouse solutions to support advanced analytics and reporting use cases
What we look for
Minimum: Bachelor’s/Postgraduate/Master’s degree in any stream
Minimum 5 years of relevant experience as a Solutions Architect, Data Architect, or in a similar role
Knowledge of the Snowflake data warehouse and understanding of data warehousing concepts, including ELT/ETL processes and data modelling
Understanding of cloud platforms (AWS, Azure, GCP) and their integration with Snowflake
Competency in data preparation and/or ETL tools to build and maintain data pipelines and flows
Strong knowledge of databases, stored procedures (SPs) and optimization of large data sets
SQL and Power BI/Tableau are mandatory, along with knowledge of any data integration tool
Excellent communication and collaboration skills
Strong problem-solving abilities and analytical mindset
Ability to work in a fast-paced, dynamic environment
What we offer:
We offer an exciting and challenging position. Joining us you will become part of a leading global multi-industrial corporation defined by its stimulating work environment and job satisfaction. In addition, we offer outstanding career development opportunities which will stretch your abilities and channel your talents.
Diversity & Inclusion
Our dedication to diversity and inclusion starts with our values. We lead with integrity and purpose, focusing on the future and aligning with our customers’ vision for success. Our High-Performance Culture ensures that we have the best talent that is highly engaged and eager to innovate. Our D&I mission elevates each employee’s responsibility to contribute to our culture. It’s through these contributions that we’ll drive the mindsets and behaviors we need to power our customers’ missions. You have the power. You have the voice. You have the culture in your hands.
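As a small, hedged illustration of the Snowflake work described above, the sketch below queries Snowflake from Python using the snowflake-connector-python package. The account locator, credentials, and object names are placeholders, not details from the posting.

```python
# Minimal sketch: query Snowflake with the official Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",   # placeholder account locator
    user="ANALYTICS_SVC",        # placeholder service user
    password="***",
    warehouse="REPORTING_WH",
    database="BUILDINGS_DW",
    schema="CURATED",
)
try:
    cur = conn.cursor()
    # Hypothetical fact table; a real model would come from the data architect.
    cur.execute(
        "SELECT site_id, AVG(energy_kwh) AS avg_kwh "
        "FROM fact_energy_usage GROUP BY site_id ORDER BY avg_kwh DESC LIMIT 10"
    )
    for site_id, avg_kwh in cur.fetchall():
        print(site_id, avg_kwh)
finally:
    conn.close()
```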
Posted 1 day ago
8.0 years
0 Lacs
Gurgaon
On-site
Project description
We are seeking an experienced Senior Project Manager with a strong background in delivering data engineering and Python-based development projects. In this role, you will manage cross-functional teams and lead Agile delivery for high-impact, cloud-based data initiatives. You'll work closely with data engineers, scientists, architects, and business stakeholders to ensure projects are delivered on time, within scope, and aligned with strategic objectives. The ideal candidate combines technical fluency, strong leadership, and Agile delivery expertise in data-centric environments.
Responsibilities
Lead and manage data engineering and Python-based development projects, ensuring timely delivery and alignment with business goals.
Work closely with data engineers, data scientists, architects, and product owners to gather requirements and define project scope.
Translate complex technical requirements into actionable project plans and user stories.
Oversee sprint planning, backlog grooming, daily stand-ups, and retrospectives in Agile/Scrum environments.
Ensure best practices in Python coding, data pipeline design, and cloud-based data architecture are followed.
Identify and mitigate risks, manage dependencies, and escalate issues when needed.
Own stakeholder communications, reporting, and documentation of all project artifacts.
Track KPIs and delivery metrics to ensure accountability and continuous improvement.
Skills
Must have
Experience: Minimum 8+ years of project management experience, including 3+ years managing data and Python-based development projects.
Agile Expertise: Strong experience delivering projects in Agile/Scrum environments with distributed or hybrid teams.
Technical Fluency: Solid understanding of Python, data pipelines, and ETL/ELT workflows. Familiarity with cloud platforms such as AWS, Azure, or GCP. Exposure to tools like Airflow, dbt, Spark, Databricks, or Snowflake is a plus.
Tools: Proficiency with JIRA, Confluence, Git, and project dashboards (e.g., Power BI, Tableau).
Soft Skills: Strong communication, stakeholder management, and leadership skills. Ability to translate between technical and non-technical audiences. Skilled in risk management, prioritization, and delivery tracking.
Nice to have
N/A
Other
Languages: English: C1 Advanced
Seniority: Senior
Gurugram, India
Req. VR-115111
Technical Project Management
BCM Industry
16/06/2025
Posted 1 day ago
8.0 years
3 - 8 Lacs
Gurgaon
On-site
Date: Jun 5, 2025
Job Requisition Id: 61535
Location: Gurgaon, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.
We are looking to hire Microsoft Fabric professionals in the following areas:
Experience: 8+ Years
Position: Data Analytics Lead
Responsibilities:
Build, manage, and foster a high-functioning team of data engineers and data analysts.
Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
Experience working with the manufacturing industry in building a centralized data platform for self-service reporting.
Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
Responsible for managing customer, partner, and internal data on the cloud and on-premises.
Evaluate and understand current data technologies and trends and promote a culture of learning.
Build an end-to-end data strategy, from collecting requirements from the business to modelling the data and building reports and dashboards.
Required Skills:
Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse/Microsoft Fabric/Azure Databricks.
Accountable for the data group’s activities, including architecting, developing, and maintaining a centralized data platform covering our operational data, data warehouse, data lake, Data Factory pipelines, and data-related services.
Experience in designing and building operationally efficient pipelines utilising core Azure components such as Azure Data Factory, Azure Databricks and PySpark.
Strong understanding of data architecture, data modelling, and ETL processes.
Proficiency in SQL and PySpark.
Strong knowledge of building Power BI reports and dashboards.
Excellent communication skills.
Strong problem-solving and analytical skills.
Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platforms or products. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools & methodologies. Able to analyse the impact of a change request, enhancement, or defect fix and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/practices in implementation. Analyze various frameworks/tools, review the code and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of industry architecture tools & frameworks. Able to identify the pros and cons of available tools & frameworks in the market, use them per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions & engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.
Required Behavioral Competencies
Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets.
Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.
Certifications: Mandatory
At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence aided with technology for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
Flexible work arrangements, free spirit, and emotional positivity
Agile self-determination, trust, transparency, and open collaboration
All support needed for the realization of business goals
Stable employment with a great atmosphere and ethical corporate culture
Posted 1 day ago
7.0 years
7 - 7 Lacs
Gurgaon
On-site
Engineer III, Database Engineering
Gurgaon, India; Hyderabad, India | Information Technology | 316332
Job Description
About The Role:
Grade Level (for internal use): 10
Role: As a Senior Database Engineer, you will work on multiple datasets that will enable S&P CapitalIQ Pro to serve up value-added Ratings, Research and related information to institutional clients.
The Team: Our team is responsible for gathering data from multiple sources spread across the globe using different mechanisms (ETL/GG/SQL Rep/Informatica/Data Pipeline) and converting it to a common format which can be used by client-facing UI tools and other data-providing applications. This application is the backbone of many S&P applications and is critical to our clients' needs. You will get to work on a wide range of technologies and tools like Oracle, SQL, .Net, Informatica, Kafka and Sonic. You will have the opportunity every day to work with people from a wide variety of backgrounds and will be able to develop a close team dynamic with coworkers from around the globe. We craft strategic implementations by using the broader capacity of the data and product. Do you want to be part of a team that executes cross-business solutions within S&P Global?
Impact: Our team is responsible for delivering essential and business-critical data with applied intelligence to power the market of the future. This enables our customers to make decisions with conviction. Contribute significantly to the growth of the firm by:
Developing innovative functionality in existing and new products
Supporting and maintaining high-revenue productionized products
Achieving the above intelligently and economically using best practices
Career: This is the place to hone your existing database skills while having the chance to be exposed to fresh technologies. As an experienced member of the team, you will have the opportunity to mentor and coach developers who have recently graduated, and collaborate with developers, business analysts and product managers who are experts in their domain.
Your skills: You should be able to demonstrate outstanding knowledge and hands-on experience in the below areas:
Complete SDLC: architecture, design, development and support of tech solutions
Play a key role in the development team to build high-quality, high-performance, scalable code
Engineer components and common services based on standard corporate development models, languages and tools
Produce technical design documents and conduct technical walkthroughs
Collaborate effectively with technical and non-technical stakeholders
Be part of a culture that continuously improves the technical design and code base
Document and demonstrate solutions using technical design docs, diagrams and stubbed code
Our Hiring Manager says: I’m looking for a person who gets excited about technology and is motivated by seeing how our individual contributions and teamwork on world-class web products affect the workflow of thousands of clients, resulting in revenue for the company.
Qualifications
Required:
Bachelor’s degree in Computer Science, Information Systems or Engineering.
7+ years of experience with transactional databases like SQL Server, Oracle and PostgreSQL, and other NoSQL databases like Amazon DynamoDB and MongoDB
Strong database development skills on SQL Server and Oracle
Strong knowledge of database architecture, data modeling and data warehouses.
Knowledge of object-oriented design and design patterns.
Familiarity with various design and architectural patterns
Strong development experience with Microsoft SQL Server
Experience in cloud-native development and AWS is a big plus
Experience with Kafka/Sonic broker messaging systems
Nice to have:
Experience in developing data pipelines using Java or C# is a significant advantage.
Strong knowledge of ETL tools such as Informatica and SSIS; exposure to Informatica is an advantage.
Familiarity with Agile and Scrum models
Working knowledge of VSTS
Working knowledge of the AWS cloud is an added advantage
Understanding of fundamental design principles for building a scalable system
Understanding of financial markets and asset classes like Equity, Commodity, Fixed Income, Options, Index/Benchmarks is desirable
Additionally, experience with Scala, Python and Spark applications is a plus.
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What’s In It For You?
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group)
Job ID: 316332
Posted On: 2025-06-16
Location: Gurgaon, Haryana, India
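The posting above mentions Kafka-based data gathering; as a hedged illustration, here is a minimal Python consumer using the kafka-python package that stages messages for downstream conversion to a common format. The topic name, brokers, and payload shape are assumptions, not details from the posting.

```python
# Illustrative-only sketch: consume update messages from a Kafka topic and
# stage them for downstream normalization. All names are placeholders.
import json
from kafka import KafkaConsumer  # from the kafka-python package

consumer = KafkaConsumer(
    "ratings-updates",                       # hypothetical topic
    bootstrap_servers=["broker1:9092", "broker2:9092"],
    group_id="ratings-staging",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    record = message.value
    # In a real pipeline this row would be converted to the common format
    # and written to a staging database for client-facing applications.
    print(record.get("entity_id"), record.get("rating"))
```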
Posted 1 day ago
The ETL (Extract, Transform, Load) job market in India is thriving with numerous opportunities for job seekers. ETL professionals play a crucial role in managing and analyzing data effectively for organizations across various industries. If you are considering a career in ETL, this article will provide you with valuable insights into the job market in India.
India's major tech hubs are known for their thriving tech industries and often have a high demand for ETL professionals.
The average salary range for ETL professionals in India varies based on experience levels. Entry-level positions typically start at around ₹3-5 lakhs per annum, while experienced professionals can earn upwards of ₹10-15 lakhs per annum.
In the ETL field, a typical career path may include roles such as:
- Junior ETL Developer
- ETL Developer
- Senior ETL Developer
- ETL Tech Lead
- ETL Architect
As you gain experience and expertise, you can progress to higher-level roles within the ETL domain.
Alongside ETL, professionals in this field are often expected to have skills in:
- SQL
- Data Warehousing
- Data Modeling
- ETL Tools (e.g., Informatica, Talend)
- Database Management Systems (e.g., Oracle, SQL Server)
Having a strong foundation in these related skills can enhance your capabilities as an ETL professional.
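To ground these skills, here is a toy end-to-end ETL example in Python that candidates can adapt for practice: extract rows from a CSV file, transform them, and load them into SQLite. The file name and columns are made up for the illustration.

```python
# Toy ETL example for practice: CSV -> transform -> SQLite.
import csv
import sqlite3

# Extract: read the raw rows from a (hypothetical) source file.
with open("sales.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: cast amounts to numbers and keep only completed orders.
clean = [
    (r["order_id"], float(r["amount"]))
    for r in rows
    if r["status"] == "COMPLETED"
]

# Load: write the cleaned rows into a local warehouse table.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
conn.commit()
conn.close()
```

Being able to walk through each of these three stages, and explain how the same pattern scales up in tools like Informatica or Talend, is a common expectation in ETL interviews.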
Here are 25 interview questions that you may encounter in ETL job interviews:
As you explore ETL jobs in India, remember to showcase your skills and expertise confidently during interviews. With the right preparation and a solid understanding of ETL concepts, you can embark on a rewarding career in this dynamic field. Good luck with your job search!