10.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
🚀 We’re Hiring: Senior Data Engineer (Remote – India | Full-Time)
We are helping our client hire a Senior Data Engineer with over 10 years of experience in modern data platforms. This is a remote role open across India, available on both a full-time and a contract basis.
💼 Position: Senior Data Engineer
🌍 Location: Remote (India)
📅 Type: Full-Time / Contract
📊 Experience: 10+ Years
🔧 Must-Have Skills:
Data Engineering, Data Warehousing, ETL
Azure Databricks & Azure Data Factory (ADF)
PySpark, SparkSQL
Python, SQL
👀 What We’re Looking For:
A strong background in building and managing data pipelines
Hands-on experience with cloud platforms, especially Azure
Ability to work independently and collaborate in distributed teams
📩 How to Apply: Please send your resume to connect@infosprucetech.com with the subject line: "Senior Data Engineer – Remote India"
⚠️ Along with your resume, kindly include the following details: Full Name, Mobile Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period, Current Location, and whether you have a PF account (Yes/No).
#DataEngineer #AzureDatabricks #ADF #PySpark #SQL #RemoteJobsIndia #HiringNow #Strive4X #FullTimeJobs #IndiaJobs
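For illustration only (not part of the original posting): a minimal sketch of the kind of Azure Databricks/PySpark pipeline this role describes, assuming hypothetical ADLS paths, column names, and a target table name; the client's actual sources and schemas are not specified above.

```python
from pyspark.sql import SparkSession, functions as F

# Assumed/hypothetical paths and table names for illustration only.
RAW_PATH = "abfss://raw@examplelake.dfs.core.windows.net/sales/"
CURATED_TABLE = "curated.sales_daily"

spark = SparkSession.builder.appName("sales-curation").getOrCreate()

# Ingest raw CSV files landed by an ADF copy activity (assumption).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv(RAW_PATH))

# Basic cleansing and a daily aggregate, mixing the DataFrame API and Spark SQL.
clean = (raw.dropDuplicates(["order_id"])
            .withColumn("order_date", F.to_date("order_timestamp")))
clean.createOrReplaceTempView("orders_clean")

daily = spark.sql("""
    SELECT order_date, country, SUM(amount) AS total_amount, COUNT(*) AS order_count
    FROM orders_clean
    GROUP BY order_date, country
""")

# Write as a managed table partitioned by date for downstream consumers.
(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .saveAsTable(CURATED_TABLE))
```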
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, configuring, and implementing system enhancements to improve HR operations. Your role will involve developing and testing system modifications, reports, and integrations based on business needs. It will be crucial to ensure best practices and compliance with Oracle HCM standards and HR policies. To qualify for this position, you should have at least 10 years of relevant professional experience and hold a Bachelor's degree in computer science, information systems, software engineering, or a related field. Additionally, you must possess strong experience in implementing Fusion Applications, with a minimum of 3 full cycles of successful implementations. You are expected to demonstrate a solid understanding of the Fusion quarterly update process and adhere to best practices for new feature adoption, testing, and change management. Proficiency in roles and security within Oracle HCM is essential. Excellent analytical and debugging skills are required for effective problem-solving in the Fusion Cloud environment. You should also have a good command of SQL, PL/SQL, and the ability to create reports in OTBI using SQL queries. Your role may involve participation in Pre-Sales activities and the drafting of technical proposals. Extensive experience in developing OAF extensions is a mandatory requirement. Furthermore, you should be capable of designing and developing customizations using Visual Builder, ADF, and Process Builder in OIC for Oracle ERP Cloud.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
Genpact is a global professional services and solutions firm focused on delivering outcomes that shape the future. With over 125,000 employees in more than 30 countries, we are driven by curiosity, agility, and the desire to create lasting value for our clients. Our purpose is the relentless pursuit of a world that works better for people, serving and transforming leading enterprises, including Fortune Global 500 companies, through deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.
We are currently seeking applications for the position of Lead Consultant - Databricks Developer - AWS. As a Databricks Developer in this role, you will be responsible for solving cutting-edge real-world problems to meet both functional and non-functional requirements.
Responsibilities:
- Stay updated on new and emerging technologies and explore their potential applications for service offerings and products.
- Collaborate with architects and lead engineers to design solutions that meet functional and non-functional requirements.
- Demonstrate knowledge of relevant industry trends and standards.
- Showcase strong analytical and technical problem-solving skills.
- Possess excellent coding skills, particularly in Python or Scala, with a preference for Python.
Qualifications
Minimum qualifications:
- Bachelor's Degree in CS, CE, CIS, IS, MIS, or an engineering discipline, or equivalent work experience.
- Stay informed about new technologies and their potential applications.
- Collaborate with architects and lead engineers to develop solutions.
- Demonstrate knowledge of industry trends and standards.
- Exhibit strong analytical and technical problem-solving skills.
- Proficient in Python or Scala coding.
- Experience in the Data Engineering domain.
- Completed at least 2 end-to-end projects in Databricks.
Additional qualifications:
- Familiarity with Delta Lake, dbConnect, db API 2.0, and Databricks workflows orchestration.
- Understanding of the Databricks Lakehouse concept and its implementation in enterprise environments.
- Ability to create complex data pipelines.
- Strong knowledge of data structures and algorithms.
- Proficiency in SQL and Spark SQL.
- Experience in performance optimization to enhance efficiency and reduce costs.
- Experience with both batch and streaming data pipelines.
- Extensive knowledge of Spark and Hive data processing frameworks.
- Experience with cloud platforms (Azure, AWS, GCP) and common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases.
- Skilled in writing unit and integration test cases.
- Excellent communication skills and experience working in teams of 5 or more.
- Positive attitude towards learning new skills and upskilling.
- Knowledge of Unity Catalog and basic governance.
- Understanding of Databricks SQL Endpoint.
- Experience in CI/CD to build pipelines for Databricks jobs.
- Exposure to migration projects for building unified data platforms.
- Familiarity with DBT, Docker, and Kubernetes.
This is a full-time position based in India-Gurugram. The job was posted on August 5, 2024, and the unposting date is set for October 4, 2024.
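As a rough, hedged illustration of the Delta Lake and streaming-pipeline skills listed above: a minimal PySpark structured streaming sketch. The S3 paths, schema, and checkpoint location are assumptions for the example, not details from the posting, and the Delta format assumes a Databricks or Delta-enabled Spark environment.

```python
from pyspark.sql import SparkSession

# Hypothetical locations for illustration only.
EVENTS_PATH = "s3://example-bucket/raw/events/"
DELTA_TABLE = "s3://example-bucket/delta/events_clean/"
CHECKPOINT = "s3://example-bucket/checkpoints/events_clean/"

spark = SparkSession.builder.appName("events-streaming").getOrCreate()

# Continuously pick up newly landed JSON files, deduplicate within the watermark,
# and append the result to a Delta table.
stream = (spark.readStream
          .schema("event_id STRING, user_id STRING, event_ts TIMESTAMP, payload STRING")
          .json(EVENTS_PATH)
          .withWatermark("event_ts", "1 hour")
          .dropDuplicates(["event_id"]))

query = (stream.writeStream
         .format("delta")
         .outputMode("append")
         .option("checkpointLocation", CHECKPOINT)
         .start(DELTA_TABLE))

query.awaitTermination()
```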
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary Data Analytics Engineer – CL4 Role Overview : As a Data Analytics Engineer , you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks. Create technical specifications, and write high-quality, supportable, scalable code ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time. Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including Agile methodologies and DevSecOps to deliver daily product deployments using full automation from code check-in to production with all quality checks through SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of the full lifecycle product development, focusing on continuous improvement, and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and data designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff. 
Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. Strong data engineering foundation with deep understanding of data-structure, algorithms, code instrumentations, etc. 5+ years proven experience with data ETL and ELT tools (such as ADF, Alteryx, cloud-native tools), data warehousing tools (such as SAP HANA, Snowflake, ADLS, Amazon Redshift, Google Cloud BigQuery). 5+ years of experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. Strong understanding of methodologies & tools like, XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. Strong preference will be given to candidates with experience in AI/ML and GenAI. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. 
This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 306372
Posted 3 weeks ago
3.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary Data Analytics Engineer – CL3 Role Overview : As a Data Analytics Engineer , you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive engineering craftmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks. Create technical specifications, and write high-quality, supportable, scalable code ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time. Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including Agile methodologies and DevSecOps to deliver daily product deployments using full automation from code check-in to production with all quality checks through SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of the full lifecycle product development, focusing on continuous improvement, and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and data designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff. 
Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. Strong data engineering foundation with deep understanding of data-structure, algorithms, code instrumentations, etc. 3+ years proven experience with data ETL and ELT tools (such as ADF, Alteryx, cloud-native tools), data warehousing tools (such as SAP HANA, Snowflake, ADLS, Amazon Redshift, Google Cloud BigQuery). 3+ years of experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. Strong understanding of methodologies & tools like XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. Strong preference will be given to candidates with experience in AI/ML and GenAI. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. 
This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 306373
Posted 3 weeks ago
5.0 - 8.0 years
0 Lacs
India
Remote
Position: Azure Data Engineer (Offshore - Remote)
Experience: 5 - 8 years
Start Date: Need to start within 30 days
Engagement Type: Full-Time
About the Role: Smartbridge is seeking an Azure Data Engineer to design, develop, and optimize data solutions leveraging Microsoft Azure technologies. The ideal candidate will have 5 to 8 years of experience working with Azure Data Factory (ADF), Azure Synapse Analytics, SQL, and ETL processes.
Responsibilities:
Develop and maintain ETL pipelines using Azure Data Factory (ADF).
Design and implement data models for efficient storage and retrieval in Azure Synapse Analytics.
Optimize SQL queries and tune performance for large datasets.
Work with Azure Data Lake, Azure SQL Database, and other cloud data solutions.
Implement data security measures, including role-based access control (RBAC) and data masking.
Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical solutions.
Technical Skills Required:
Azure Data Factory (ADF) – Building, orchestrating, and monitoring data pipelines.
Azure Synapse Analytics – Data modeling, performance optimization.
SQL Server & T-SQL – Writing complex queries, stored procedures.
ETL & Data Transformation – Experience handling large datasets.
Azure Data Lake & Blob Storage – Managing structured and unstructured data.
Power BI or other visualization tools (preferred but not mandatory).
Python or Spark (Databricks experience is a plus).
Additional Requirements:
Bachelor’s degree in Computer Science, Data Engineering, or a related field.
Microsoft Azure certification (preferred but not mandatory).
Experience in Oil & Gas, Life Science, or Food & Beverage is a plus.
100% remote role – Must be available for the second shift to overlap with the US team until noon CST.
3-month probation period before full-time confirmation. If certification is not already held, it must be completed during probation.
Recruitment Process & Technical Testing: Candidates will undergo a 45-60 minute TestGorilla assessment, including:
Intro Video Section – Candidate introduction & motivation.
Analytical & Problem-Solving Skills – Scenario-based questions.
Technical Test – Covering SQL, Azure Data Engineering-related questions, and possible coding tasks.
Join Smartbridge and be part of an innovative team driving cloud data solutions!
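To illustrate the incremental-load pattern that ADF/SQL pipelines of this kind commonly implement, here is a minimal Python sketch using pyodbc. The connection string, watermark table, and source/staging table names are hypothetical; an actual implementation would typically live inside an ADF pipeline or stored procedure rather than a standalone script.

```python
import pyodbc

# Hypothetical connection string and table names for illustration.
CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=example.database.windows.net;Database=dw;Uid=etl_user;Pwd=***"
)

def incremental_copy():
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()

        # Read the last successfully loaded watermark (high-water-mark pattern).
        cur.execute("SELECT last_loaded_at FROM etl.watermark WHERE table_name = ?", "sales_orders")
        last_loaded_at = cur.fetchone()[0]

        # Pull only rows modified since the previous run.
        cur.execute(
            "SELECT order_id, amount, modified_at FROM src.sales_orders WHERE modified_at > ?",
            last_loaded_at,
        )
        rows = cur.fetchall()

        # Load into a staging table (simplified to plain inserts here).
        cur.executemany(
            "INSERT INTO stg.sales_orders (order_id, amount, modified_at) VALUES (?, ?, ?)",
            [(r.order_id, r.amount, r.modified_at) for r in rows],
        )

        # Advance the watermark only after the load succeeds.
        cur.execute(
            "UPDATE etl.watermark SET last_loaded_at = SYSUTCDATETIME() WHERE table_name = ?",
            "sales_orders",
        )
        conn.commit()

if __name__ == "__main__":
    incremental_copy()
```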
Posted 3 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
IT Analyst: Business Solutions
Job #: req33732
Organization: World Bank
Sector: Information Technology
Grade: GE
Term Duration: 4 years 0 months
Recruitment Type: Local Recruitment
Location: Chennai, India
Required Language(s): English
Preferred Language(s):
Closing Date: 7/23/2025 (MM/DD/YYYY) at 11:59pm UTC
Description
Do you want to build a career that is truly worthwhile? Working at the World Bank Group provides a unique opportunity for you to help our clients solve their greatest development challenges. The World Bank Group is one of the largest sources of funding and knowledge for developing countries; a unique global partnership of five institutions dedicated to ending extreme poverty, increasing shared prosperity and promoting sustainable development. With 189 member countries and more than 130 offices worldwide, we work with public and private sector partners, investing in groundbreaking projects and using data, research, and technology to develop solutions to the most urgent global challenges. For more information, visit www.worldbank.org
ITS Vice Presidency Context
The Information and Technology Solutions (ITS) Vice Presidential Unit (VPU) enables the World Bank Group to achieve its mission of ending extreme poverty and boosting shared prosperity on a livable planet by delivering transformative information and technologies to its staff working in over 150 locations. For more information on ITS, see this video: https://www.youtube.com/watch?reload=9&v=VTFGffa1Y7w
Our vision is to transform how the Bank Group accomplishes its mission through information and technology. In this fast-paced, ever-changing world, the formulation and implementation of the ITS strategy is an ongoing, iterative process of learning and adaptation developed through extensive consultations with business partners throughout the World Bank Group. ITS shapes its strategy in response to changing business priorities and leverages new technologies to achieve three high-level business outcomes: business enablement, by providing Bank Group units with innovative digital tools and technologies to transform how they deliver value for their clients; empowerment & effectiveness, by ensuring that all Bank Group staff are connected, able to find information, and productive to accelerate the delivery of development solutions globally; and resilience, by equipping the Bank Group to provide risk-based cybersecurity and robust data protection for a global network and a growing cloud platform. Implementation of the strategy is guided by three core principles. The first is to deliver solutions for business partners that are customer-centric, innovative, and transformative. The second is to provide the Bank Group with value for money with selective and standard technologies. The third principle is to excel at the basics by providing a high-performing, robust, and resilient IT environment for the organization.
ITSFE Context
WBG Finance (ITSFI) is responsible for providing high-quality, streamlined information and technology solutions for the World Bank’s financial functions, which include Corporate Finance, Risk Management, Controls, Treasury, Loans, Accounting, and Concessional Finance (handling donor contributions from inception to the point of final disbursement, including IDA, Financial Intermediary Funds and Trust Funds). ITSFI is additionally responsible for building its IT services using a shared platform that provides scale, leverage, reliability, and control while at the same time improving responsiveness to emerging business needs.
The ITSFI team is accountable for the implementation of the ITS Strategy supporting WBG core finance business processes. As a unit within the Finance ITS, ITS Financial Engineering (ITSFE) unit provides systems and technology support to mission critical and core financial applications of the WBG finance complex units (DFI, WFA and BPS). The Development Finance IT systems team in ITSFE, provides IT solutions to support WBG development finance & strategic resource mobilization operations for IDA, finance & accounting for Trust Funds, Capital subscription and management processes for IBRD, Externally Funded Output & Variance Reporting & Analysis related processes for IBRD, etc. Duties And Accountabilities The IT Analyst will report to the ITSFE team lead and will work closely with finance clients primarily focusing on supporting the modern cloud-based web applications built for IDA Mobilization, IBRD Corporate Finance (DFCII), Concessional finance & Accounting units. He/She will also be expected to provide support to maintain the business applications systems landscape, cross support enterprise initiatives that have touchpoints with core business processes and co-ordinate the delivery of ongoing maintenance services. His/Her primary responsibilities will include: Support and maintain the new generation of IDA, TF F&A and Corporate Finance cloud-based systems (Azure based web applications, dashboards and reports) with interfaces to SAP and other Enterprise Systems. Establish systems to monitor the operating efficiency of existing application systems and provide (or arrange for) proactive maintenance. Pro-actively liaise with the onsite and offshore teams (IT teams and business clients) to ensure IT support activities are on-track within defined specifications. Facilitate the transition of application support activities for new applications, from original development team to the offshore support team. Develop code, and/or configure and test programs from clear specifications. Prepare documentation of all procedures used in systems. Develop detailed flowcharts to show processing logic for programs. Debug systems and provide daily operational support for production systems, as per the guidance from the team lead. Participate in user training as appropriate. Provide production support and preventive maintenance and on call support as appropriate. Collaborate with business analysts, project managers, and/or clients to analyze and clarify requirements. Assist in the development of technical specification documents. Test own work and contribute to the development of test plans and participate in post-implementation reviews. Understand the organizational mission, values, operations and goals and ITS policies and procedures and serve as a resource to other professionals. Ensure team is compliant and in alignment of the same. Selection Criteria Education: Master's degree with 2 years relevant experience or Bachelors Degree with a minimum of 4 years relevant experience and experience preferably with MDBs. Experience working with geographically distributed teams across different time zones. Proven work experience in implementation and development of cloud-based solutions on an enterprise scale, specifically on Microsoft Azure platforms. Good knowledge of Azure Services (APM, ASE, AKS, Key Vault, SQL Server) and Azure Data platform (ADF, Azure Analysis Services, BLOB, Data Lake, Power BI etc.,). 
Strong experience in working with custom developed interfaces from Azure cloud solutions to SAP R/3, SAP BW and other applications. Familiarity with industry standard processes defined for systems design, database design, development, testing, and integration phases of a project and agile based implementations. Hands-on experience with Microsoft .Net, Angular, MS SQL, Azure Functions, ASE, AKS, Azure AD process, Azure Data Lake and APIGEE gateway technologies. Track record demonstrating eagerness to learn new things while effectively contributing to the active work program. Demonstrated ability to effectively liaise with stakeholders on requirements, documenting issues, and getting consensus on resolutions. Experience in change management procedures and use of ADO/PLM processes for execution. Proven ability to contribute to projects with minimal guidance. Capacity to work both independently and in a team. Willingness to seek advice and assistance when needed. Excellent communication (oral & written) and collaboration skills. Preferred Additional Experience/Qualifications Knowledge of ERPs & custom development, especially SAP technologies (SAP R/3 & SAP BW) and experience in developing custom applications for the finance domain using Azure PaaS services. Work experience with Multi-lateral Development Banks (MDBs) and/or other international financial institutions. Microsoft Azure certifications in areas of cloud infrastructure, developing services and designing and maintaining financial business data. World Bank Group Core Competencies The World Bank Group offers comprehensive benefits, including a retirement plan; medical, life and disability insurance; and paid leave, including parental leave, as well as reasonable accommodations for individuals with disabilities. We are proud to be an equal opportunity and inclusive employer with a dedicated and committed workforce, and do not discriminate based on gender, gender identity, religion, race, ethnicity, sexual orientation, or disability. Learn more about working at the World Bank and IFC , including our values and inspiring stories.
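As a small, assumed illustration of Azure PaaS development for this kind of role: an HTTP-triggered Azure Function in Python. The posting's actual applications are built on .NET and Angular; this Python variant, the 'fund_id' parameter, and the stubbed response are illustrative assumptions only.

```python
import json
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function returning a hypothetical contribution summary."""
    logging.info("Processing contribution summary request.")

    fund_id = req.params.get("fund_id")  # hypothetical query parameter
    if not fund_id:
        return func.HttpResponse("Missing 'fund_id' parameter.", status_code=400)

    # A real implementation would query Azure SQL or the Data Lake; stubbed here.
    summary = {"fund_id": fund_id, "status": "ok"}
    return func.HttpResponse(
        json.dumps(summary), mimetype="application/json", status_code=200
    )
```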
Posted 3 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
• Looking for 4-15 years of experience as a Data Engineer, with strong experience in SQL, T-SQL, Azure Data Factory (ADF), and Databricks.
• Good to have: experience in SSIS & Python.
Notice Period: Immediate
Email: sachin@assertivebs.com
Posted 3 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
Bengaluru
Remote
As a Senior Azure Data Engineer, your responsibilities will include:
Building scalable data pipelines using Databricks and PySpark
Transforming raw data into usable business insights
Integrating Azure services like Blob Storage, Data Lake, and Synapse Analytics
Deploying and maintaining machine learning models using MLlib or TensorFlow
Executing large-scale Spark jobs with performance tuning on Spark Pools
Leveraging Databricks Notebooks and managing workflows with MLflow
Qualifications:
Bachelor's/Master's in Computer Science, Data Science, or equivalent
7+ years in Data Engineering, with 3+ years in Azure Databricks
Strong hands-on experience in PySpark, Spark SQL, RDDs, Pandas, NumPy, and Delta Lake
Azure ecosystem: Data Lake, Blob Storage, Synapse Analytics
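A minimal sketch of the Databricks-style workflow described above, combining PySpark, Delta Lake, and MLflow tracking. The storage paths, feature logic, and metric names are assumptions for illustration and are not taken from the posting.

```python
import mlflow
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("user-features").getOrCreate()

# Hypothetical Delta table in ADLS; the real sources are not named in the posting.
events = spark.read.format("delta").load(
    "abfss://data@examplelake.dfs.core.windows.net/delta/events"
)

# Simple per-user feature aggregation with PySpark.
features = (events.groupBy("user_id")
            .agg(F.count("*").alias("event_count"),
                 F.max("event_ts").alias("last_seen")))

features.write.format("delta").mode("overwrite").save(
    "abfss://data@examplelake.dfs.core.windows.net/delta/user_features"
)

# Track the refresh with MLflow so the workflow run is auditable and reproducible.
with mlflow.start_run(run_name="user-features-refresh"):
    mlflow.log_param("source_table", "delta/events")
    mlflow.log_metric("rows_written", features.count())
```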
Posted 3 weeks ago
1.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Associate Campaign Manager Location: Noida Company: Adfluence Hub Industry: Influencer Marketing Employment Type: Full-time About Us: At Adfluence Hub, we pride ourselves on being a leading influencer marketing agency that delivers impactful and authentic campaigns for our clients. We are seeking a dedicated and dynamic Campaign Manager to join our team and contribute to our mission of excellence. Position Overview: We are seeking a highly skilled Campaign Manager to oversee the strategic execution of large-scale micro and nano influencer campaigns. This role requires a detail-oriented and results-driven professional who can manage the full campaign lifecycle, from influencer identification to execution and performance analysis. Key Responsibilities: Influencer Sourcing & Relationship Management: You will be responsible for identifying and onboarding relevant influencers, both micro and macro, ensuring they meet our standards for audience quality and engagement. Building and maintaining a robust network of micro-influencers is crucial for efficient campaign scaling. You'll negotiate competitive pricing, achieve monthly sign-up targets, and cultivate long-term relationships for continued collaboration. Campaign Execution & Coordination: You will develop and execute influencer marketing strategies aligned with client goals, working closely with internal teams to define objectives and timelines. Precision in managing contracts, deliverables, and payments is essential. You’ll ensure brand compliance and oversee all aspects of campaign execution, from content approvals to final rollouts. Analytics & Performance Tracking: Utilizing data-driven insights, you’ll track and analyze campaign performance, focusing on ROI, engagement, and conversions. You’ll leverage analytics tools along with the ADF tech platform to monitor influencer impact and optimize campaigns, delivering post-campaign reports with actionable insights for continuous improvement. Process Optimization & Automation: You will implement streamlined workflows for influencer onboarding and campaign execution, leveraging tools like Google Spreadsheets to automate tracking and reporting. Collaborating with platform and tech teams, you'll enhance influencer recruitment and campaign scalability. Key Performance Indicators (KPIs): Timely Campaign Execution Comprehensive Tracker Maintenance Influencer Satisfaction Levels Campaign Performance Metrics Influencer Onboarding Efficiency Qualifications & Skills: Experience: Minimum 1+ years of experience in influencer marketing, with a focus on micro-influencer campaigns. Experience in the Beauty and Personal Care industry is a plus. Core Competencies: Influencer Relationship Management: Ability to build and maintain strong influencer partnerships. Project Management: Strong organizational and time-management skills, capable of managing multiple campaigns simultaneously. Communication & Negotiation: Excellent verbal and written communication skills, with proven negotiation abilities. Strategic Thinking: Ability to develop and execute data-driven influencer marketing strategies. Data Analysis: the ability to interpret campaign metrics and optimize accordingly. Technical Skills: Proficiency in Google Spreadsheets, analytics tools, basic video editing and email communication platforms. Professional Attributes: Results-driven and highly motivated, with a commitment to achieving campaign objectives. Proactive and adaptable, capable of thriving in a fast-paced environment. 
Strong attention to detail and a commitment to quality. Ability to work well within a team. Company Culture: At Adfluence Hub, we value creativity, collaboration, and a positive work environment. We believe in fostering growth and development, both professionally and personally, and strive to create an inclusive and supportive workplace for all our team members. How to Apply: If you are passionate about influencer marketing and possess the skills to drive impactful campaigns, we would love to hear from you. Please submit your resume. Join us and be part of a team that values innovation, collaboration, and campaign success!
Posted 3 weeks ago
3.0 years
0 Lacs
Gurgaon
On-site
Gurgaon | 1 | 3+ Years | Full Time
We are seeking a skilled Data Engineer with strong expertise in Azure Data Factory (ADF), Snowflake, and Kafka (Confluent Platform). The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and streaming solutions.
Key Responsibilities:
Design, develop, and maintain ADF pipelines for data ingestion, transformation, and orchestration.
Monitor and troubleshoot pipeline failures and ensure smooth data flow.
Write efficient and optimized SQL queries and create complex views in Snowflake.
Integrate and manage Kafka producers/consumers and implement real-time data stream processing using the Confluent Platform.
Collaborate with data analysts, architects, and software developers to deliver end-to-end data solutions.
(Optional) Support backend development using Java 17 and Spring Boot, if required.
Candidate Requirements:
3 to 5 years of hands-on experience in:
Azure Data Factory (ADF): pipeline creation, monitoring, troubleshooting.
Snowflake: complex SQL queries, view creation, query optimization.
Apache Kafka and Confluent Platform: stream processing, producer/consumer integration.
Good understanding of data modeling, ETL concepts, and cloud-based architectures.
Nice to have: experience in Java 17 and Spring Boot.
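To illustrate the Kafka-to-Snowflake integration skills listed above, here is a minimal Python sketch using the confluent-kafka consumer and the Snowflake connector. Broker, topic, credential, and table names are hypothetical, and a production pipeline would batch records or use Snowpipe rather than row-by-row inserts.

```python
import json

import snowflake.connector
from confluent_kafka import Consumer

# Hypothetical connection details for illustration only.
consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",
    "group.id": "orders-loader",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",
    warehouse="LOAD_WH", database="RAW", schema="SALES",
)
cur = conn.cursor()

try:
    while True:
        msg = consumer.poll(1.0)          # wait up to 1 second for a record
        if msg is None or msg.error():
            continue
        order = json.loads(msg.value())
        # Simplified row-by-row insert into a landing table.
        cur.execute(
            "INSERT INTO ORDERS_STREAM (ORDER_ID, AMOUNT, RAW_PAYLOAD) VALUES (%s, %s, %s)",
            (order["order_id"], order["amount"], msg.value().decode("utf-8")),
        )
finally:
    consumer.close()
    cur.close()
    conn.close()
```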
Posted 3 weeks ago
12.0 years
1 - 3 Lacs
Hyderābād
On-site
Overview: Seeking a Manager, Data Operations, to support our growing data organization. In this role, you will play a key role in maintaining data pipelines and corresponding platforms (on-prem and cloud) while collaborating with global teams on DataOps initiatives. Manage the day-to-day operations of data pipelines, ensuring governance, reliability, and performance optimization on Microsoft Azure. This role requires hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, real-time streaming architectures, and DataOps methodologies. Ensure availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence. Support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in implementing real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency. Contribute to the development of governance models and execution roadmaps to optimize efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments. Work on CI/CD integration, data pipeline automation, and self-healing capabilities to enhance enterprise-wide data operations. Collaborate on building and supporting next-generation Data & Analytics platforms while fostering an agile and high-performing DataOps team. Support the adoption of Data & Analytics technology transformations, ensuring full sustainment capabilities and automation for proactive issue identification and resolution. Partner with cross-functional teams to drive process improvements, best practices, and operational excellence within DataOps. Responsibilities: Support the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics. Assist in managing end-to-end data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability. Ensure seamless batch, real-time, and streaming data processing while focusing on high availability and fault tolerance. Contribute to DataOps automation initiatives, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps, Terraform, and Infrastructure-as-Code (IaC). Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to enable data-driven decision-making. Work with IT, data stewards, and compliance teams to align DataOps practices with regulatory and security requirements. Support data operations and sustainment efforts, including testing and monitoring processes to support global products and projects. Assist in data capture, storage, integration, governance, and analytics initiatives, collaborating with cross-functional teams. Manage day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements. Engage with SMEs and business stakeholders to align data platform capabilities with business needs. Participate in the Agile work intake and management process to support execution excellence for data platform teams. Collaborate with cross-functional teams to troubleshoot and resolve issues related to cloud infrastructure and data services. Assist in developing and automating operational policies and procedures to improve efficiency and service resilience. 
Support incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies. Foster a customer-centric environment, advocating for operational excellence and continuous service improvements. Contribute to building a collaborative, high-performing team culture focused on automation and efficiency in DataOps. Adapt to shifting priorities and support cross-functional teams in maintaining productivity while meeting business goals. Leverage technical expertise in cloud and data operations to improve service reliability and scalability. Qualifications: 12+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred. 12+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance. 8+ years of experience working within a cross-functional IT organization, collaborating with multiple teams. 5+ years of experience in a management or lead role, with a focus on DataOps execution and delivery. Hands-on experience with Azure Data Factory (ADF) for orchestrating data pipelines and ETL workflows. Proficiency in Azure Synapse Analytics, Azure Data Lake Storage (ADLS), and Azure SQL Database. Familiarity with Azure Databricks for large-scale data processing (basic troubleshooting or support scope is sufficient if not engineering-focused). Exposure to cloud environments (AWS, Azure, GCP) and understanding of CI/CD pipelines for data operations. Knowledge of structured and semi-structured data storage formats (e.g., Parquet, JSON, Delta). Excellent communication skills, with the ability to empathize with stakeholders and articulate technical concepts to non-technical audiences. Strong problem-solving abilities, prioritizing customer needs and advocating for operational improvements. Customer-focused mindset, ensuring high-quality service delivery and operational excellence. Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment. Experience in supporting mission-critical solutions in a Microsoft Azure environment, including data pipeline automation. Familiarity with Site Reliability Engineering (SRE) practices, such as automated issue remediation and scalability improvements. Experience driving operational excellence in complex, high-availability data environments. Ability to collaborate across teams, fostering strong relationships with business and IT stakeholders. Experience in data management concepts, including master data management, data governance, and analytics. Knowledge of data acquisition, data catalogs, data standards, and data management tools. Strong analytical and strategic thinking skills, with the ability to execute plans effectively and drive results. Proven ability to work in a fast-changing, complex environment, adapting to shifting priorities while maintaining productivity.
Posted 3 weeks ago
5.0 - 10.0 years
0 Lacs
Hyderābād
On-site
Category: Software Development/ Engineering
Main location: India, Andhra Pradesh, Hyderabad
Position ID: J0725-0450
Employment Type: Full Time
Position Description: At CGI, we’re a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com. This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: Azure Databricks Developer
Position: Senior Software Engineer
Experience: 5-10 Years
Category: Software Development/ Engineering
Main location: India, Bangalore / Hyderabad / Chennai
Position ID: J0725-0450
Employment Type: Full Time
Your future duties and responsibilities: We are seeking a skilled Azure Databricks Developer with 5-10 years of experience to design, develop, and optimize big data pipelines using Databricks on Azure. The ideal candidate will have strong expertise in PySpark, Azure Data Lake, and data engineering best practices in a cloud environment.
Key Responsibilities:
Design and implement ETL/ELT pipelines using Azure Databricks and PySpark.
Work with structured and unstructured data from diverse sources (e.g., ADLS Gen2, SQL DBs, APIs).
Optimize Spark jobs for performance and cost-efficiency.
Collaborate with data analysts, architects, and business stakeholders to understand data needs.
Develop reusable code components and automate workflows using Azure Data Factory (ADF).
Implement data quality checks, logging, and monitoring (an illustrative sketch follows this posting).
Participate in code reviews and adhere to software engineering best practices.
Required Skills & Qualifications:
3-5 years of experience in Apache Spark / PySpark.
3-5 years working with Azure Databricks and Azure Data Services (ADLS Gen2, ADF, Synapse).
Strong understanding of data warehousing, ETL, and data lake architectures.
Proficiency in Python and SQL.
Experience with Git, CI/CD tools, and version control practices.
Required qualifications to be successful in this role:
Experience: 5 to 10 Yrs
Location: Bangalore / Hyderabad / Chennai
Education: BE / B.Tech / MCA / BCA
Skills: Azure Databricks, Azure Data Factory, SQL, PySpark, Python, ETL
What you can expect from us: Together, as owners, let’s turn meaningful insights into action. Life at CGI is rooted in ownership, teamwork, respect and belonging.
Here, you’ll reach your full potential because… You are invited to be an owner from day 1 as we work together to bring our Dream to life. That’s why we call ourselves CGI Partners rather than employees. We benefit from our collective success and actively shape our company’s strategy and direction. Your work creates value. You’ll develop innovative solutions and build relationships with teammates and clients while accessing global capabilities to scale your ideas, embrace new opportunities, and benefit from expansive industry and technology expertise. You’ll shape your career by joining a company built to grow and last. You’ll be supported by leaders who care about your health and well-being and provide you with opportunities to deepen your skills and broaden your horizons. Come join our team—one of the largest IT and business consulting services firms in the world.
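Referring back to the data quality checks, logging, and monitoring mentioned in the responsibilities above: a minimal PySpark sketch with hypothetical rules and table names (assumptions for illustration, not details from the posting).

```python
import logging

from pyspark.sql import SparkSession, functions as F

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq-checks")

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical curated table to validate.
df = spark.table("curated.customer_orders")

# Rule 1: the business key must be non-null and unique.
null_keys = df.filter(F.col("order_id").isNull()).count()
dupe_keys = df.count() - df.dropDuplicates(["order_id"]).count()

# Rule 2: order amounts must be non-negative.
negative_amounts = df.filter(F.col("amount") < 0).count()

failures = {"null_keys": null_keys,
            "duplicate_keys": dupe_keys,
            "negative_amounts": negative_amounts}
log.info("Data quality results: %s", failures)

# Fail the job (and hence the surrounding ADF/Databricks workflow step) on any violation.
if any(v > 0 for v in failures.values()):
    raise ValueError(f"Data quality checks failed: {failures}")
```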
Posted 3 weeks ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Principal Duties And Responsibilities
User Management
Perform user management for the Workday ERP database, including role and permission management.
Workday Cloud Support
Manage and implement basic system configuration.
Provide functional guidance to developers and the QA team in the execution of business processes related to system functions and behaviors.
Provide data guidance to developers and the QA team regarding questions on tables and data elements.
Manage the database changes associated with upgrades to the Workday ERP software.
Define and manage a Disaster Recovery process.
Continuously monitor the databases for performance issues.
Work with GHX support organizations to remediate database issues.
Processes
Define and implement a data refresh process and strategy.
Define and implement a data de-identification process.
Define and implement multiple test environments.
Operational Duties
Adhere to Change Management guidelines for all system changes and enhancements.
Manage database user access.
Knowledge And Skills
Required Qualifications
Bachelor’s degree in Computer Science/Information Technology/Systems or related field, or demonstrated equivalent experience.
5+ years of hands-on Workday Cloud Administration and system support experience with mid to large market sized companies.
Experience with the following Workday Cloud applications: GL, AP, AR, FA, Cash, Procurement, SSP, BI, SmartView and ADF.
2+ years of hands-on experience in Workday Cloud Database Administration.
2+ years of hands-on experience in a support organization or capacity.
2+ years of experience with data refresh and de-identification strategies and implementation.
Understanding of Quality Assurance testing practices.
Hands-on knowledge of Workday Cloud SCM Workflow.
Required Skills
Possess strong business acumen to communicate with and support Sales, Sales Operations, Customer Support, Finance, Accounting, Revenue, Purchasing, and HR as needed in a functional capacity.
Possess reasonable technical acumen to allow learning/working in a basic technical/functional capacity in all Corporate Systems platforms.
Advanced PC skills, including MS Excel, PowerPoint, Outlook, and basic SQL.
Strong analytical and problem-solving abilities.
Strong interpersonal and communication skills.
Familiarity with current project management/execution methodologies.
Must be task oriented with strong organizational and time management skills.
Flexible and able to quickly adapt to a dynamic business environment.
Ability to effectively communicate (written and verbal) complex solutions and ideas at a level suitable for any level of personnel, from basic business users to highly technical developers.
Ability to provide excellent customer service and collaborate between teams.
Ability to handle workload under time pressure and meet strict deadlines.
Ability to keep highly sensitive information confidential and be familiar with HIPAA and GDPR regulations.
Must be able to manage time using a work queue comprised of ‘issue tickets’ across multiple platforms and perform to published service levels (SLA) and key results (KR).
KEY DIFFERENTIATORS
Certifications
GHX: It's the way you do business in healthcare
Global Healthcare Exchange (GHX) enables better patient care and billions in savings for the healthcare community by maximizing automation, efficiency and accuracy of business processes.
GHX is a healthcare business and data automation company, empowering healthcare organizations to enable better patient care and maximize industry savings using our world class cloud-based supply chain technology exchange platform, solutions, analytics and services. We bring together healthcare providers and manufacturers and distributors in North America and Europe - who rely on smart, secure healthcare-focused technology and comprehensive data to automate their business processes and make more informed decisions. It is our passion and vision for a more operationally efficient healthcare supply chain, helping organizations reduce - not shift - the cost of doing business, paving the way to delivering patient care more effectively. Together we take more than a billion dollars out of the cost of delivering healthcare every year. GHX is privately owned, operates in the United States, Canada and Europe, and employs more than 1000 people worldwide. Our corporate headquarters is in Colorado, with additional offices in Europe. Disclaimer Global Healthcare Exchange, LLC and its North American subsidiaries (collectively, “GHX”) provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, national origin, sex, sexual orientation, gender identity, religion, age, genetic information, disability, veteran status or any other status protected by applicable law. All qualified applicants will receive consideration for employment without regard to any status protected by applicable law. This EEO policy applies to all terms, conditions, and privileges of employment, including hiring, training and development, promotion, transfer, compensation, benefits, educational assistance, termination, layoffs, social and recreational programs, and retirement. GHX believes that employees should be provided with a working environment which enables each employee to be productive and to work to the best of his or her ability. We do not condone or tolerate an atmosphere of intimidation or harassment based on race, color, national origin, sex, sexual orientation, gender identity, religion, age, genetic information, disability, veteran status or any other status protected by applicable law. GHX expects and requires the cooperation of all employees in maintaining a discrimination and harassment-free atmosphere. Improper interference with the ability of GHX’s employees to perform their expected job duties is absolutely not tolerated.
Posted 3 weeks ago
0 years
6 - 9 Lacs
Calcutta
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant - Sr. Integration Architect & Lead (Snowflake, Seeburger, ADF)

Responsibilities
- Architect modern data solutions using Snowflake, Seeburger, and Azure Data Factory (ADF)
- Create and review solution design artifacts for Snowflake and ADF platforms
- Promote modular architecture by designing reusable ADF pipelines and Snowflake models
- Ensure compliance with data governance and Seeburger-based B2B integration standards
- Guide teams on best practices for ADF orchestration and Snowflake performance optimization
- Validate Seeburger message flows and ADF workflows during data infrastructure build phases
- Define operational roles for managing Snowflake environments and Seeburger integrations
- Identify and address architectural risks across ADF pipelines and Snowflake compute layers
- Evaluate design options across Snowflake and ADF for performance, cost, and scalability
- Lead technical strategy to build reusable components using ADF, Snowflake, and Seeburger

Qualifications we seek in you!
Minimum Qualifications
- Bachelor's degree in information science, data management, computer science, or a related field preferred
- Experience in the Data Engineering domain
- Experience with Snowflake, ADF, Seeburger, and their components
- Strong technical architecture skills, with proven experience designing and architecting end-to-end solutions using Snowflake, Seeburger, and Azure Data Factory
- Strong communication skills to lead global teams delivering Snowflake platforms and Seeburger integrations
- Experience in designing data solutions for supply chain management and EDI transactions
- In-depth knowledge of IT delivery models and cloud lifecycles for ADF and Snowflake deployments
- Hands-on expertise with Seeburger message flows and operational ADF pipeline management
- Leadership in defining architectural direction and promoting reuse across Snowflake and ADF components
- Should have designed the end-to-end architecture of a unified data platform covering data ingestion, transformation, serving, and consumption using tools like Snowflake, Seeburger, and Azure Data Factory
- Should have designed and implemented at least 2-3 projects end-to-end in Snowflake and ADF
- Should have hands-on experience in Snowflake workflow orchestration, security management, platform governance, and data security

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Senior Principal Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jul 8, 2025, 11:37:30 PM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
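The posting above repeatedly calls for reusable, parameterized ADF pipelines within a unified data platform. As an illustration only (not part of the advertisement), here is a minimal Python sketch of triggering one such parameterized pipeline run through the Azure SDK; the subscription, resource group, factory, pipeline, and parameter names are all hypothetical.

```python
# Illustrative sketch: triggering a parameterized (reusable) ADF pipeline run
# via the Azure SDK for Python. All resource names and parameters are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-data-platform"     # hypothetical
FACTORY_NAME = "adf-unified-platform"   # hypothetical
PIPELINE_NAME = "pl_generic_ingest"     # hypothetical reusable pipeline

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# A reusable pipeline exposes parameters (source container, target table, load date)
# so one definition can serve many feeds, e.g. Seeburger-delivered EDI files.
run = adf_client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={
        "sourceContainer": "edi-inbound",
        "targetTable": "RAW.SUPPLIER_ORDERS",
        "loadDate": "2025-07-08",
    },
)
print(f"Started pipeline run: {run.run_id}")
```

The same pattern generalizes: each new feed only supplies a different parameter set, rather than a new pipeline definition.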
Posted 3 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
- Design, deploy, and manage Azure infrastructure, including virtual machines, storage accounts, virtual networks, and other resources.
- Assist teams by deploying applications to AKS clusters using containerization technologies such as Docker, Kubernetes, Container Registry, etc.
- Familiarity with the Azure CLI and the ability to use PowerShell to, for example, scan Azure resources, make modifications, and produce a report or a dump.
- Setting up a 2- or 3-tier application on Azure: VMs, web apps, load balancers, proxies, etc.
- Well versed in security: AD, MI/SPN, firewalls.
- Networking: NSGs, VNETs, private endpoints, ExpressRoute, Bastion, etc.
- Familiarity with a scripting language like Python for automation.
- Leveraging Terraform (or Bicep) for automating infrastructure deployment.
- Cost tracking, analysis, reporting, and management at the resource-group level.
- Experience with Azure DevOps.
- Experience with Azure Monitor.
- Strong hands-on experience in ADF, Linked Services/IR (self-hosted/managed), Logic Apps, Service Bus, Databricks, and SQL Server.
- Strong understanding of Python, Spark, and SQL (nice to have).
- Ability to work in fast-paced environments, as we have tight SLAs for tickets.
- Self-driven, with an exploratory mindset, as the work requires a good amount of research (within and outside the application).
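The role above centers on scan-and-report automation against Azure resources. As a purely illustrative sketch (the posting mentions PowerShell for this, but also lists Python for automation, which is used here), the snippet below inventories resources per resource group with the Azure SDK for Python; the subscription ID is a placeholder.

```python
# Illustrative sketch: inventorying Azure resources per resource group with the
# Azure SDK for Python and printing a simple count-by-type report.
from collections import Counter

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Count resources by type within each resource group and print a report.
for group in client.resource_groups.list():
    type_counts = Counter(
        res.type for res in client.resources.list_by_resource_group(group.name)
    )
    print(f"\nResource group: {group.name} ({group.location})")
    for resource_type, count in sorted(type_counts.items()):
        print(f"  {resource_type}: {count}")
```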
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
We are seeking a skilled Data Engineer with strong expertise in Azure Data Factory (ADF), Snowflake, and Kafka (Confluent Platform). The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and streaming solutions.

Key Responsibilities
- Design, develop, and maintain ADF pipelines for data ingestion, transformation, and orchestration.
- Monitor and troubleshoot pipeline failures and ensure smooth data flow.
- Write efficient and optimized SQL queries and create complex views in Snowflake.
- Integrate and manage Kafka producers/consumers and implement real-time data stream processing using the Confluent Platform.
- Collaborate with data analysts, architects, and software developers to deliver end-to-end data solutions.
- (Optional) Support backend development using Java 17 and Spring Boot, if required.

Candidate Requirements
- 3 to 5 years of hands-on experience in:
  - Azure Data Factory (ADF): pipeline creation, monitoring, troubleshooting.
  - Snowflake: complex SQL queries, view creation, query optimization.
  - Apache Kafka and the Confluent Platform: stream processing, producer/consumer integration.
- Good understanding of data modeling, ETL concepts, and cloud-based architectures.
- Nice to have: experience in Java 17 and Spring Boot.

APPLY NOW
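For the Kafka producer/consumer integration called out above, a minimal illustrative sketch using the confluent-kafka Python client follows; the broker address, topic, and consumer group are hypothetical and not taken from the posting.

```python
# Illustrative sketch: a minimal Confluent Kafka producer and consumer in Python
# (confluent-kafka library). Broker, topic, and group id are hypothetical.
import json

from confluent_kafka import Consumer, Producer

BOOTSTRAP = "broker:9092"   # hypothetical
TOPIC = "orders.raw"        # hypothetical

# Producer: publish a JSON event, keyed by order id for partition affinity.
producer = Producer({"bootstrap.servers": BOOTSTRAP})
producer.produce(TOPIC, key="order-1001",
                 value=json.dumps({"order_id": 1001, "amount": 250.0}))
producer.flush()

# Consumer: poll the topic and hand each record to downstream processing.
# Demo loop only; stop with Ctrl+C.
consumer = Consumer({
    "bootstrap.servers": BOOTSTRAP,
    "group.id": "orders-loader",      # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value())
        print(f"Consumed order {event['order_id']} from partition {msg.partition()}")
finally:
    consumer.close()
```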
Posted 3 weeks ago
5.0 years
0 Lacs
Greater Kolkata Area
On-site
Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song — all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com

What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients’ supply chains and customer experience.

The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and fixing bugs. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for?
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
- Expertise in ADF with a deep understanding of its data-related libraries.
- Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
- Proficiency in SQL and experience with SQL database design.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience with ADF pipelines.
- Excellent problem-solving and troubleshooting skills.
- Experience in code review and debugging in a collaborative project setting.
- Excellent verbal and written communication skills.
- Ability to work in a fast-paced, team-oriented environment.
- Strong understanding of the business and a passion for the mission of Service Supply Chain.
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities
Innovate. Collaborate. Build. Create. Solve.
ADF & associated systems:
- Ensure systems meet business requirements and industry practices.
- Integrate new data management technologies and software engineering tools into existing structures.
- Recommend ways to improve data reliability, efficiency, and quality.
- Use large data sets to address business issues.
- Use data to discover tasks that can be automated.
- Fix bugs to ensure a robust and sustainable codebase.
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
- Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
- Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
- Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems.
- Flexible work hours to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises.
- Participate in the Demand Management and Change Management processes.
- Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).
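One of the responsibilities above is proactive monitoring of ADF pipelines. As an illustrative sketch only (resource names are hypothetical and not from the posting), the following Python snippet uses the Azure Data Factory management SDK to list pipeline runs that failed in the last 24 hours.

```python
# Illustrative sketch: pulling failed ADF pipeline runs from the last 24 hours
# with the Azure SDK for Python. Resource names are hypothetical placeholders.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters, RunQueryFilter

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "rg-data-platform"    # hypothetical
FACTORY_NAME = "adf-ssc-prod"          # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
    filters=[RunQueryFilter(operand="Status", operator="Equals", values=["Failed"])],
)

# Query the factory for failed runs and print the error message for each.
response = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
for run in response.value:
    print(f"{run.pipeline_name}  run_id={run.run_id}  message={run.message}")
```

A report like this can feed a ticketing queue (Jira/ServiceNow) so failures are triaged before the business notices them.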
Posted 3 weeks ago
2.0 years
0 Lacs
Kochi, Kerala, India
On-site
Job Title - + +
Management Level:
Location: Kochi, Coimbatore, Trivandrum
Must have skills: Python/Scala, PySpark/PyTorch
Good to have skills: Redshift

Job Summary
You’ll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Your responsibilities will include:

Roles and Responsibilities
- Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals.
- Solving complex data problems to deliver insights that help our business achieve its goals.
- Sourcing data (structured to unstructured) from various touchpoints, and formatting and organizing it into an analyzable format.
- Creating data products for analytics team members to improve productivity.
- Calling AI services such as vision and translation to generate an outcome that can be used in further steps along the pipeline.
- Fostering a culture of sharing, re-use, design, and operational efficiency of data and analytical solutions.
- Preparing data to create a unified database and building tracking solutions that ensure data quality.
- Creating production-grade analytical assets deployed using the guiding principles of CI/CD.

Professional and Technical Skills
- Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript.
- Extensive experience in data analysis (big data / Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras), and SQL, with 2-3 years of hands-on experience working with these technologies.
- Experience in one of the many BI tools such as Tableau, Power BI, or Looker.
- Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs.
- Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse.

Additional Information
- Experience working in cloud data warehouses like Redshift or Synapse.
- Certification in any one of the following or equivalent:
  - AWS: AWS Certified Data Analytics - Specialty
  - Azure: Microsoft Certified Azure Data Scientist Associate
  - Snowflake: SnowPro Core - Data Engineer
  - Databricks Data Engineering

About Our Company | Accenture

Experience: 3.5-5 years of experience is required.
Educational Qualification: Graduation (accurate educational details should be captured).
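Since the role above is built around ETL pipelines in PySpark, a small illustrative read-transform-write step is sketched below; the storage paths and column names are hypothetical, not taken from the posting.

```python
# Illustrative sketch: a small PySpark ETL step -- read raw files, standardise
# columns, and write an analyzable dataset. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: raw CSV landed by an upstream ingestion job (hypothetical ADLS path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@datalake.dfs.core.windows.net/orders/")
)

# Transform: trim keys, parse timestamps, stamp the load date, drop bad rows.
clean = (
    raw.withColumn("customer_id", F.trim(F.col("customer_id")))
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("load_date", F.current_date())
       .filter(F.col("order_id").isNotNull())
)

# Load: write a partitioned Parquet dataset for downstream analytics.
(clean.write
      .mode("overwrite")
      .partitionBy("load_date")
      .parquet("abfss://curated@datalake.dfs.core.windows.net/orders/"))
```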
Posted 3 weeks ago
0.0 - 2.0 years
5 - 12 Lacs
Pune, Maharashtra
On-site
Company name: PibyThree Consulting Services Pvt Ltd.
Location: Baner, Pune
Start date: ASAP

Job Description:
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have hands-on experience with Azure Data Factory (ADF), Snowflake, and data warehousing concepts. The Data Engineer will be responsible for designing, developing, and maintaining large-scale data pipelines and architectures.

Key Responsibilities:
- Design, develop, and deploy data pipelines using Azure Data Factory (ADF)
- Work with Snowflake to design and implement data warehousing solutions
- Collaborate with cross-functional teams to identify and prioritize data requirements
- Develop and maintain data architectures, data models, and data governance policies
- Ensure data quality, security, and compliance with regulatory requirements
- Optimize data pipelines for performance, scalability, and reliability
- Troubleshoot data pipeline issues and implement fixes
- Stay up-to-date with industry trends and emerging technologies in data engineering

Requirements:
- 4+ years of experience in data engineering, with a focus on cloud-based data platforms (Azure preferred)
- 2+ years of hands-on experience with Azure Data Factory (ADF)
- 1+ year of experience working with Snowflake
- Strong understanding of data warehousing concepts, data modeling, and data governance
- Experience with data pipeline orchestration tools such as Apache Airflow or Azure Databricks
- Proficiency in programming languages such as Python, Java, or C#
- Experience with cloud-based data storage solutions such as Azure Blob Storage or Amazon S3
- Strong problem-solving skills and attention to detail

Job Type: Full-time
Pay: ₹500,000.00 - ₹1,200,000.00 per year
Schedule: Day shift
Ability to commute/relocate: Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Education: Bachelor's (Preferred)
Experience:
- Total work: 4 years (Preferred)
- PySpark: 2 years (Required)
- Azure Data Factory: 2 years (Required)
- Databricks: 2 years (Required)
Work Location: In person
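As a purely illustrative companion to the ADF-plus-Snowflake pipeline work described above, the sketch below loads staged files into a Snowflake table with the Snowflake Python connector; the account, credentials, stage, and table names are hypothetical placeholders.

```python
# Illustrative sketch: loading files from an external stage into a Snowflake
# table with the Snowflake Python connector. All names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",   # placeholder
    user="<user>",
    password="<password>",
    warehouse="LOAD_WH",              # hypothetical
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls staged files (e.g. landed by an ADF pipeline) into a table.
    cur.execute("""
        COPY INTO RAW.ORDERS
        FROM @RAW.AZURE_LANDING_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    for row in cur.fetchall():
        print(row)  # per-file load status returned by COPY INTO
finally:
    conn.close()
```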
Posted 3 weeks ago
5.0 - 10.0 years
14 - 24 Lacs
Bengaluru
Work from Office
Role & Responsibilities:
Within the technical team and under the guidance of the Team Manager, you will:
- Be in charge of installing, configuring, and upgrading/patching the Product applications internally
- Handle and follow up on technical issues (of wide diversity and complexity) and perform corrective actions
- Interact actively with the functional and technical teams (including development and architecture) located around the globe
- Provide advice on choices and implementation of interfaces/surrounds (inbound and outbound), including advice and support on how to develop client reporting
- Propose solutions to address client challenges
- Provide on-call support (24x7) on a rotation basis, including weekend/holiday support
- Support in shifts on a rotation basis
- Contribute towards the technical knowledge base (preparation of documents/presentations on related topics)
- Provide training, guidance, and support to client IT teams

Job Description:
- Good skills in Oracle Fusion Middleware 11g/12c (Forms & Reports, ADF, BI Publisher, Oracle Identity Management (OID/OAM))
- Good skills in handling middleware vulnerabilities and security (CVE) related queries
- Excellent analytical and logical skills
- Ability to address problems methodically
- Ability to anticipate client needs and be proactive
- Strong motivation to continuously increase quality and efficiency
- Good presentation and communication skills
- Autonomous, rigorous, and well organized
- Capability to work within a global team (spread across geographies) and interact with different teams
- Willingness to work in rotational shifts
- Quick learner, keen to learn SQL, PL/SQL, Oracle Database, and new technologies
- Good to have: knowledge of SQL, PL/SQL, and Oracle Database
Posted 3 weeks ago
4.0 - 8.0 years
5 - 15 Lacs
Chennai, Delhi / NCR, Mumbai (All Areas)
Hybrid
Job Description (JD): Azure Databricks / ADF / Synapse, with strong emphasis on Python, SQL, Data Lake, and Data Warehouse

Job Title: Data Engineer - Azure (Databricks / ADF / Synapse)
Experience: 4 to 7 Years
Location: Pan India
Employment Type: Full-Time
Notice Period: Immediate to 30 Days

Job Summary:
We are looking for a skilled and experienced Data Engineer with 4 to 8 years of experience in building scalable data solutions on the Microsoft Azure ecosystem. The ideal candidate must have strong hands-on experience with Azure Databricks, Azure Data Factory (ADF), or Azure Synapse Analytics, along with Python and SQL expertise. Familiarity with Data Lake and Data Warehouse concepts and end-to-end data pipelines is essential.

Key Responsibilities:
- Requirement gathering and analysis
- Experience with different databases such as Synapse, SQL DB, Snowflake, etc.
- Design and implement data pipelines using Azure Data Factory, Databricks, and Synapse
- Create and manage Azure SQL Data Warehouses and Azure Cosmos DB databases
- Extract, transform, and load (ETL) data from various sources into Azure Data Lake Storage
- Implement data security and governance measures
- Monitor and optimize data pipelines for performance and efficiency
- Troubleshoot and resolve data engineering issues
- Provide optimized solutions for any problem related to data engineering
- Ability to work with a variety of sources such as relational databases, APIs, file systems, real-time streams, CDC, etc.
- Strong knowledge of Databricks and Delta tables

Required Skills:
- 4-8 years of experience in Data Engineering or related roles.
- Hands-on experience in Azure Databricks, ADF, or Synapse Analytics.
- Proficiency in Python for data processing and scripting.
- Strong command of SQL: writing complex queries, performance tuning, etc.
- Experience working with Azure Data Lake Storage and Data Warehouse concepts (e.g., dimensional modeling, star/snowflake schemas).
- Understanding of CI/CD practices in a data engineering context.
- Excellent problem-solving and communication skills.

Good to Have:
- Experience in Delta Lake, Power BI, or Azure DevOps.
- Knowledge of Spark, Scala, or other distributed processing frameworks.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Familiarity with data security and compliance in the cloud.
- Experience in leading a development team.
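The posting above asks for strong knowledge of Databricks and Delta tables. As an illustrative sketch (the paths and join key are hypothetical, and a Delta-enabled Spark session such as a Databricks cluster is assumed), the snippet below shows a common incremental upsert pattern into a Delta table using the delta-spark API.

```python
# Illustrative sketch: incremental upsert (MERGE) into a Delta table with the
# delta-spark API. Assumes a Delta-enabled Spark session (e.g. Databricks).
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Incoming batch of changed rows (e.g. staged by an ADF copy activity).
updates = spark.read.parquet("abfss://staging@datalake.dfs.core.windows.net/customers/")

target = DeltaTable.forPath(
    spark, "abfss://curated@datalake.dfs.core.windows.net/customers_delta/"
)

# MERGE keeps the curated table in sync: update matching keys, insert new ones.
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```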
Posted 3 weeks ago
6.0 - 8.0 years
12 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Job Title: Oracle Fusion Functional Consultant - Cash Management & Lease Accounting
Location: Hyderabad / Bangalore
Experience: 6-8 Years
Department: Oracle ERP Finance

Job Summary
We are seeking an experienced Oracle Fusion Functional Consultant specializing in Cash Management and Lease Accounting, with strong functional knowledge and hands-on expertise in Fusion Financials. The ideal candidate should have 2-3 end-to-end implementation and/or support cycles, with a preference for candidates who have previously held Financial Functional Lead roles. Familiarity with Oracle Cloud tools, workflow processes, and prior EBS experience is highly desirable.

Key Responsibilities
- Lead or support implementations of the Oracle Fusion Cash Management and Lease Accounting modules.
- Collaborate with business stakeholders to gather requirements and translate them into functional specifications.
- Write functional design documents (MD50) and test scripts, and support OTBI reports & Fusion analytics.
- Work on Oracle workflow processes and assist technical teams with integration and reporting needs.
- Leverage FSM (Functional Setup Manager) and ADF (Application Development Framework) for configurations and issue resolution.
- Use Data Integration (DI) tools for mass data uploads and validations.
- Engage in testing, data migration, UAT, and post-go-live support.
- Ensure compliance with Oracle Cloud best practices and security standards.

Required Skills & Experience
- 2-3 implementations or support projects in Oracle Fusion Cash Management & Lease Accounting.
- Strong hands-on knowledge of Oracle Fusion Financials.
- Experience writing functional specs and working on OTBI (Oracle Transactional Business Intelligence) and Fusion Analytics.
- Solid understanding of workflow processes and how to configure them in Oracle Cloud.
- Familiarity with Oracle FSM, ADF tools, and Data Integration (DI) tools.
- Prior experience in Oracle EBS (Financials).
- Proven ability to work with cross-functional teams and technical counterparts.
- Strong communication, documentation, and stakeholder management skills.

Preferred Qualifications
- Experience in a Financial Functional Lead role in past projects.
- Oracle Financials Cloud certification preferred (e.g., General Ledger, Payables, Receivables).
- Exposure to multi-currency, intercompany, and bank reconciliation processes.
- Familiarity with Agile/Hybrid project methodologies.
Posted 3 weeks ago
6.0 - 11.0 years
25 - 35 Lacs
Bengaluru
Hybrid
We are hiring Azure Data Engineers for an active project (Bangalore location). Interested candidates can share the details below by email, along with their updated resume:
- Total experience?
- Relevant experience in Azure Data Engineering?
- Current organization?
- Current location?
- Current fixed salary?
- Expected salary?
- Do you have any offers? If yes, mention the offer you have and the reason for looking for another opportunity.
- Open to relocating to Bangalore?
- Notice period? If serving / not working, mention your LWD.
- Do you have a PF account?
Posted 3 weeks ago
6.0 - 11.0 years
12 - 22 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.

Job Description:
Experience: 6-12 years
Location: PAN India
Skill: Azure Data Factory / SSIS

Interested candidates can share their resume with sangeetha.spstaffing@gmail.com along with the details below:
- Full Name as per PAN:
- Mobile No:
- Alternate No / WhatsApp No:
- Total Experience:
- Relevant Experience in Data Factory:
- Relevant Experience in Synapse:
- Relevant Experience in SSIS:
- Relevant Experience in Python/PySpark:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for a virtual interview on weekdays between 10 AM and 4 PM (please mention a time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational percentage in 10th, 12th, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:
Posted 3 weeks ago